Hands on: Getting started and Nerf frameworks - 3a1b2c3/seeingSpace GitHub Wiki


So after seeing everybody's cool NeRFs you want to try it yourself. By now we are in the lucky position of having some real choices when it comes to creating NeRFs.

Here, I understand a framework as a code base that offers multiple NeRF models rather than just one kind.

TL;DR

Your choices will depend on your available hardware and your ability to install and run mostly Python code, locally or in the cloud (no actual coding required). If you want to capture with a fisheye camera, your choices are more limited as well. Mesh, texture and point cloud export are not standard across all implementations, so check for them if you expect to use these.

Lumalabs.ai is the easiest no-code option to get started. For more control, pick either NVIDIA Instant NGP or the open source Nerfstudio.

There is now a Windows binary version of NVIDIA Instant NGP. It should be a beginner-friendly option for people with an RTX card.

This is more fun with other users: join the NeRF Discord at https://discord.gg/Mkp2zKzd or the Nerfstudio Discord at https://discord.gg/CB2PwjBh and build the future with us.

Picking a NeRF framework

Hardware: NVIDIA RTX graphics card for local runs

NeRFs are expensive to compute. Your options will depend on whether you have a good NVIDIA graphics card available. An RTX 30 series (3060, 3070, 3080 or 3090) or 40 series graphics card with at least 8 GB of VRAM is needed. Those cards are most commonly found in gaming machines and above. Macs and other computers using non-NVIDIA GPUs generally won't work at all, because NeRF training and rendering are accelerated with NVIDIA CUDA.

Don't be too disappointed if your card doesn't fit the bill. You can still run in the cloud.

Cloud options

Most frameworks can be run in the cloud but do not provide a GUI there; Nerfstudio is the exception.

The individual framework sections below have more instructions.

Languages and tools

Most of the frameworks use NVIDIA CUDA (specifically the tiny-cuda-nn library) together with PyTorch, plus COLMAP, a structure-from-motion tool used as a camera tracker to recover poses from the input images.
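The COLMAP step looks much the same whichever framework you pick (most of them wrap these calls for you). A minimal sketch, assuming your captured photos sit in an `images/` folder; the paths and matcher choice here are placeholders:

```shell
# Extract SIFT features from each image into a new database
colmap feature_extractor --database_path colmap.db --image_path images

# Match features between every image pair (fine for small captures;
# for video frames, sequential_matcher is usually faster)
colmap exhaustive_matcher --database_path colmap.db

# Sparse reconstruction: recovers camera poses and a sparse point cloud
mkdir -p sparse
colmap mapper --database_path colmap.db --image_path images --output_path sparse
```

The recovered poses in `sparse/` are what the NeRF frameworks consume, usually after a small conversion script.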

Running on the cloud

No-code cloud app: NeRF as a service, Lumalabs.ai (beta)

A very easy to use no-code app, currently in a closed free beta; not a framework like the others.

You can sign up and just upload images or use their iOS app.

Development is rapid, but since it is a beta it is still unclear what the commercial product will look like.

It is my recommendation for a quick, no-code start into NeRFs.

  • No hardware required beyond a phone (or any camera) and an internet connection
  • Friendly community and Discord channel (invitation only)
  • The future pricing and license situation is unclear at this point
  • Mesh, texture and point cloud export
  • Fisheye and Insta360 camera support
  • Closed source, but has a REST API
  • The web viewer doesn't show the NeRFs themselves (only renderings and meshes)
  • https://lumalabs.ai/

On your local machine (or hybrid with cloud/Google Colab)

Nerfstudio (local and cloud)

Very active development and community. Supports a large range of NeRF models: Mip-NeRF, Nerfacto, Instant-NGP and Semantic NeRF-W.

Nerfstudio is my recommendation if you are comfortable with git and running some Python. It seems the most future-proof option at this point.
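To give a feel for the workflow, here is a rough sketch assuming a working CUDA/PyTorch environment; the exact commands may change between releases, so check the Nerfstudio documentation:

```shell
# Install nerfstudio into the current Python environment
pip install nerfstudio

# Turn a folder of photos into a posed dataset (runs COLMAP under the hood)
ns-process-data images --data ./my_photos --output-dir ./my_dataset

# Train the default nerfacto model; a link to the web viewer
# is printed while training runs
ns-train nerfacto --data ./my_dataset
```

The `./my_photos` and `./my_dataset` paths are placeholders for your own capture and output folders.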

NVIDIA Instant NGP, nvdiffrec and the Kaolin Wisp library

NVIDIA Instant NGP

This is the code from the original paper and comes with a GUI. Fast and easy to use. Written entirely in CUDA, it needs modest technical skills to build and is not very suitable for custom development. It can do some rendering, but it sees little active development now and will probably never become a product. It offers a single NeRF model and some SDF modes, so it is not really a framework, but it is still a good way to start with NeRFs.

  • There is now a Windows binary version of Instant NGP. Just unzip and start training: https://youtu.be/TA14yYBIRP8

  • https://github.com/NVlabs/instant-ngp

  • NVIDIA Instant NGP uses a commercial license; please check it in the repository

  • Large community, some documentation, some user support in their GitHub issues

  • Local rendering and the GUI require an RTX 30 series (3060-3090) or 40 series graphics card

  • No official Colab support, but it is possible to run it there. See my step-by-step tests

  • Limited/low-resolution mesh export and fisheye support
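For the build-from-source route on Linux, the steps condensed from the repository README look roughly like this (requires the CUDA toolkit and a recent CMake; the executable name has changed between versions, so treat this as an assumption and check the current README):

```shell
# Clone with submodules and build (this is the step that needs CUDA installed)
git clone --recursive https://github.com/NVlabs/instant-ngp
cd instant-ngp
cmake . -B build
cmake --build build --config RelWithDebInfo -j

# Train and view the bundled fox example scene
# (older releases used ./build/testbed --scene instead)
./instant-ngp data/nerf/fox
```

For your own captures, the repository ships a `scripts/colmap2nerf.py` helper that runs COLMAP and writes the `transforms.json` the trainer expects.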

Kaolin wisp: A PyTorch Library and Engine for Neural Fields Research

Setup is a bit involved, but it has a nice GUI, great speed and fast iterations. It is more research than creator oriented.
Good for custom coding; the library sees some active development.

  • Proprietary license, free for non-commercial use
  • Some user support in their github https://github.com/NVIDIAGameWorks/kaolin-wisp
  • No Colab/Cloud support
  • No mesh, texture and point cloud export
  • No Fisheye and insta360 camera support

nvdiffrec: mesh and light reconstruction from images

Technically not a NeRF, but a joint optimization of topology, materials and lighting from multi-view image observations, as described in the paper Extracting Triangular 3D Models, Materials, and Lighting From Images.

No longer under active development

Can recover environment maps from unmasked input

https://nvlabs.github.io/nvdiffrec/

https://github.com/NVlabs/nvdiffrec

NeRF-Factory

I haven't looked deeply into it, but there is also the PyTorch-based https://github.com/kakaobrain/NeRF-Factory

It also seems more research than creator oriented.

It contains PyTorch implementations of 7 popular NeRF models, starting with the original NeRF.

Kornia

Kornia is a differentiable computer vision library for PyTorch. I haven't tried it; let me know if you do.

https://github.com/kornia/kornia/releases/tag/v0.6.8

  • Apache License

Pytorch3d

There is some NeRF work in PyTorch3D. I haven't tried it; let me know if you do.

https://github.com/facebookresearch/pytorch3d/tree/main/projects/nerf

Google research

I haven't tried it; let me know if you do.

A Colab notebook about voxel-based radiance fields, using JAX and Flax:

https://github.com/google-research/google-research/blob/master/trainable_grids/Voxel_based_Radiance_Fields.ipynb

NeRF datasets

Some tips on capturing NeRF data

Popular NeRF datasets

If you don't want to use your own data, here are some proven ones. Licenses may vary.
