NVIDIA Instant NeRF on Google Colab: train a NeRF without a massive GPU - 3a1b2c3/seeingSpace GitHub Wiki

Because my GTX graphics card is a bit too old to train Instant NeRF locally, I am working on a Google Colab setup for NVIDIA Instant NeRF, so you can try it before buying a new card. Google Colab is free, although you can get a Pro subscription.

NVIDIA Instant NeRF for the rest of us

https://github.com/NVlabs/instant-ngp is probably the first commercial Nerf application out there. Let's get started.

Note: NVIDIA Instant NeRF uses a commercial license; please check it here

Need advice picking the right NeRF framework? See Getting started and NeRF frameworks

TL;DR: the theory

Instant NeRF reduces the computational cost with a versatile new input encoding that permits the use of a smaller network without sacrificing quality, significantly reducing the number of floating-point and memory access operations: a small neural network is augmented by a multiresolution hash table of trainable feature vectors whose values are optimized through stochastic gradient descent. Video
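To make the idea concrete, here is a minimal sketch of such a multiresolution hash encoding in pure Python, simplified to 2D with toy sizes (the real implementation is a fused CUDA kernel in 3D with trilinear interpolation and trained tables; all constants below are invented for illustration):

```python
import math

# Toy 2D multiresolution hash encoding, loosely following the Instant NGP idea:
# each level has a virtual grid; grid-corner features live in a fixed-size hash
# table and the per-level results are concatenated into the MLP input.
LEVELS = 4
FEATURES = 2            # feature vector size per table entry
TABLE_SIZE = 2 ** 10    # entries per level (real tables are much larger)
BASE_RES, MAX_RES = 4, 64
growth = math.exp((math.log(MAX_RES) - math.log(BASE_RES)) / (LEVELS - 1))

# Trainable tables, here just filled with small deterministic values
# (training would adjust them via stochastic gradient descent).
tables = [[[0.01 * ((h * FEATURES + f) % 7 - 3) for f in range(FEATURES)]
           for h in range(TABLE_SIZE)] for _ in range(LEVELS)]

def corner_hash(ix, iy):
    # Spatial hash in the spirit of the paper: xor of coordinates
    # multiplied by large primes, folded into the table size.
    return ((ix * 1) ^ (iy * 2654435761)) % TABLE_SIZE

def encode(x, y):
    """Encode a point in [0,1)^2 into LEVELS * FEATURES values."""
    out = []
    for lvl in range(LEVELS):
        res = int(BASE_RES * growth ** lvl)
        fx, fy = x * res, y * res
        ix, iy = int(fx), int(fy)
        wx, wy = fx - ix, fy - iy
        feat = [0.0] * FEATURES
        # Bilinear interpolation of the four surrounding corner features.
        for dx, dy, w in [(0, 0, (1 - wx) * (1 - wy)), (1, 0, wx * (1 - wy)),
                          (0, 1, (1 - wx) * wy), (1, 1, wx * wy)]:
            entry = tables[lvl][corner_hash(ix + dx, iy + dy)]
            for f in range(FEATURES):
                feat[f] += w * entry[f]
        out.extend(feat)
    return out

vec = encode(0.3, 0.7)
print(len(vec))  # LEVELS * FEATURES = 8
```

The key point is that the encoding, not the network, carries most of the capacity: the MLP that consumes this 8-value vector can stay tiny.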

Read the paper: here.

The long version, with more theory about neural rendering, is here

Train your NeRF (for free) on Google's Colab

Instant NeRF needs an NVIDIA graphics card with 8 GB of memory, but it is also possible to run it on Google's free Colab offering. My smaller GTX 1090 card is theoretically supported, but I got CUDA memory errors all the time, so I found this discussion.

https://colab.research.google.com/drive/10TgQ4gyVejlHiinrmm5XOvQQmgVziK3i?usp=sharing

We can do every step except the network training and rendering on our local machine.

  • Download the Jupyter notebook. The included fox example took a bit over half an hour to train on Colab for me, so you lose some of the "instant": Training: 100% 99995/100000 [37:44<00:00, 44.16step/s, loss=0.000291]


Step by step

Overview

  1. Find camera positions for your images with COLMAP. I did that on my local computer, but there is nothing wrong with running that Python script in the cloud. Most existing datasets already ship a transforms.json file with this data.
  2. Train the network (via a notebook) from the images; optionally save a snapshot .msgpack file (in Colab, in the cloud).
  3. Export a mesh or render from a new camera (in Colab, in the cloud); using the snapshot file, this step is fast.
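For reference, the transforms.json produced in step 1 is just a JSON file holding the camera intrinsics and one 4x4 camera-to-world matrix per image. A minimal hand-built sketch (field names follow the instant-ngp convention; all numeric values here are invented examples):

```python
import json

# Minimal transforms.json skeleton in the instant-ngp style.
# camera_angle_x is the horizontal field of view in radians; each frame
# carries a 4x4 camera-to-world transform_matrix.
transforms = {
    "camera_angle_x": 0.87,
    "fl_x": 1100.0, "fl_y": 1100.0,   # focal lengths in pixels
    "cx": 960.0, "cy": 540.0,         # principal point
    "w": 1920, "h": 1080,
    "frames": [
        {
            "file_path": "./images/0001.jpg",
            "transform_matrix": [
                [1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 4.0],  # camera 4 units along +z
                [0.0, 0.0, 0.0, 1.0],
            ],
        },
    ],
}

print(json.dumps(transforms, indent=2)[:60])
```

COLMAP (via instant-ngp's colmap2nerf.py helper) fills this in automatically; building it by hand is only useful for synthetic data or debugging.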

Prepare and upload your source data (on Google Drive)

I am using this CC-licensed footage.

Here is the result:

*(animated "tilt" render)*

You can download the data here if you do not want to build it and run COLMAP yourself. I also uploaded the zipped .obj mesh result.

Run the training in Google Colab

  • You can open the notebook in Google Drive and run it in Colab
  • Make sure your Colab runtime has a GPU by choosing the connect option (under the Runtime menu):
  • Run all cells in the notebook
  • The code connects to your Google Drive and will prompt for your login; for alternatives, see the link below under Colab resources. This mounts the drive to a file path, so you can access it like any other folder
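The mount step is a one-liner; a sketch of a typical Colab cell (the fallback branch and the ColabNERF folder name are my own choices, not required by anything):

```python
# Mount Google Drive so the notebook can read the dataset and write
# snapshots/renders. `google.colab` only exists inside a Colab runtime,
# so fall back to a local folder when run elsewhere.
try:
    from google.colab import drive
    drive.mount("/content/drive")
    data_root = "/content/drive/MyDrive/ColabNERF"
except ImportError:
    data_root = "./ColabNERF"  # running outside Colab

print(data_root)
```

After mounting, the Drive contents appear under /content/drive/MyDrive and behave like an ordinary folder in the Files tab.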

Google Colab File tab

  • Your notebook will first install and build Instant NeRF from NVIDIA's GitHub; that can take a few minutes

  • We are using the Python wrapper script run.py, since it gives you more options than testbed.exe and runs without a GUI.

You can run a cell with just the help command, !python ./scripts/run.py -h, to see all script options

  • I am initially using something like !python ./scripts/run.py --mode nerf --scene /content/drive/MyDrive/ColabNERF/table/images --save_snapshot /content/drive/MyDrive/ColabNERF/table/snap/mySnap.msgpack

Training Progress displayed in the notebook

Save snapshots, images and models

Training/Snapshots (.msgpack)

  • A binary snapshot file lets you re-run a NeRF without re-training, or continue training; mine turned out to be some 90 MB. I just keep the snapshot on Google Drive.

  • You can also download the snapshot and use it with the GUI on your local machine if you have a local build of Instant NeRF

  • The width and screenshot_dir parameters have to be specified for rendering screenshots to work

The syntax to train, save and render a model looks somewhat like this:

# some variables
scene="/content/sample_data" # the folder containing transforms.json
screenshot_dir="/content/sample_data/renders"
snap="/content/sample_data/snap/chair_abb8.msgpack"
width=1080
height=1080
!python ./scripts/run.py --mode nerf --scene {scene} --width {width} --height {height} --screenshot_dir {screenshot_dir} --save_snapshot {snap}
# swap --save_snapshot for --load_snapshot {snap} to reuse a trained snapshot

![image](https://user-images.githubusercontent.com/74843139/171987438-f71de8c6-e76b-4fc2-a662-410107b35673.png)

Exporting Meshes (obj/ply)

Meshes are generated using marching cubes. To export one, add this flag to your run.py call:

 --save_mesh /content/drive/MyDrive/ColabNERF/table/render/m.obj 
*(image: obj model vs. raymarching)*
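Once downloaded, a quick sanity check on the exported mesh is to count its vertices and faces; the Wavefront .obj format is plain text, so a few lines of Python suffice (a sketch; marching-cubes output can be large, so this streams line by line):

```python
def obj_stats(lines):
    """Count vertices and faces in Wavefront .obj text lines."""
    verts = faces = 0
    for line in lines:
        if line.startswith("v "):    # geometric vertex record
            verts += 1
        elif line.startswith("f "):  # face record
            faces += 1
    return verts, faces

# Tiny example .obj: a single triangle.
sample = [
    "# exported mesh",
    "v 0.0 0.0 0.0",
    "v 1.0 0.0 0.0",
    "v 0.0 1.0 0.0",
    "f 1 2 3",
]
print(obj_stats(sample))  # (3, 1)
```

With a real export you would pass the open file directly, e.g. obj_stats(open("m.obj")).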

Rendering (--screenshot_transforms)

To animate a camera you need a JSON file with per-frame camera settings, similar to the transforms.json from COLMAP; I included mine as camera.json. Somebody made a tool that makes this easier: https://nerf-pathtools.netlify.app/
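If you prefer to generate such a file yourself, here is a sketch that produces per-frame camera-to-world matrices orbiting the scene. The field names mirror transforms.json; axis and sign conventions differ between renderers (instant-ngp uses an OpenGL-style camera), so treat the basis construction as a starting point, not gospel:

```python
import json, math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def orbit_frames(n, radius=4.0, height=1.0):
    """n camera-to-world matrices orbiting the origin, looking inward,
    packed into a transforms.json-like dict."""
    frames = []
    for i in range(n):
        a = 2 * math.pi * i / n
        eye = (radius * math.cos(a), height, radius * math.sin(a))
        fwd = normalize(tuple(-c for c in eye))          # look at the origin
        right = normalize(cross(fwd, (0.0, 1.0, 0.0)))
        up = cross(right, fwd)
        back = tuple(-c for c in fwd)                    # -z forward convention
        m = [[right[k], up[k], back[k], eye[k]] for k in range(3)]
        m.append([0.0, 0.0, 0.0, 1.0])
        frames.append({"file_path": f"frame_{i:04d}.png",
                       "transform_matrix": m})
    return {"frames": frames}

cams = orbit_frames(8)
print(len(cams["frames"]))  # 8
# json.dump(cams, open("camera.json", "w"), indent=2) would write the file
```

Rendering then becomes a single run.py call that loads the snapshot and points --screenshot_transforms at this file.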


Download files from Drive

Missing from the script but nice in the GUI

  • Camera editor
  • Crop AABB; there is a bug (https://github.com/NVlabs/instant-ngp/issues/683). testbed.render_aabb = ngp.BoundingBox([0.25, -1.5, -1.5], [1.5, 2.5, 2.5]) presumably sets the initial rendering confines, and testbed.compute_and_save_marching_cubes_mesh(args.save_mesh, [res, res, res], ngp.BoundingBox([0.25, -1.5, -1.5], [1.5, 2.5, 2.5])) exports a mesh cropped to that box

Inspiration

Please feel free to contribute or call out any errors.

Colab resources