DistillNeRF: Data-Efficient Initialization of Neural Radiance Fields using Knowledge Distillation


project
20/01/2023
1 min


Neural Radiance Fields (NeRF) learn a high-quality, continuous 3D implicit representation of a scene from multiple views. While the approach has gained popularity in Novel View Synthesis (NVS), its vanilla implementation is not suited for real-time applications. This paper presents DistillNeRF, a data-efficient method for initializing and training smaller models using Knowledge Distillation (KD). Smaller models naturally benefit from lower inference times but lose perceptual quality. DistillNeRF reduces training time by building a priority grid from a teacher network and using it as a data-efficient proxy for sampling more informative training examples, cutting the quality loss while using half the memory.
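The core sampling idea can be sketched as follows: a coarse grid of priorities is derived from the teacher's predictions and normalized into a distribution, from which training sample locations are drawn. This is a minimal illustration, not the paper's implementation; the function names (`teacher_density`, `build_priority_grid`, `sample_training_points`) and the synthetic teacher are assumptions made for the sketch.

```python
import numpy as np

def teacher_density(points):
    """Stand-in for a trained teacher NeRF's density output at 3D points.
    Here a synthetic density peaking at the origin mimics an
    object-centric scene (assumption for illustration only)."""
    return np.exp(-np.linalg.norm(points, axis=-1) ** 2)

def build_priority_grid(resolution=16, bound=1.0):
    """Evaluate the teacher on a coarse voxel grid and normalize the
    values into a probability distribution over voxels (the 'priority
    grid')."""
    axis = np.linspace(-bound, bound, resolution)
    xs, ys, zs = np.meshgrid(axis, axis, axis, indexing="ij")
    points = np.stack([xs, ys, zs], axis=-1).reshape(-1, 3)
    priorities = teacher_density(points)
    return priorities / priorities.sum(), points

def sample_training_points(priority, grid_points, n_samples, rng):
    """Draw sample locations proportionally to voxel priority, so the
    student spends its capacity where the teacher sees scene content."""
    idx = rng.choice(len(priority), size=n_samples, p=priority)
    return grid_points[idx]

rng = np.random.default_rng(0)
priority, grid_points = build_priority_grid()
samples = sample_training_points(priority, grid_points, 1024, rng)
```

With this weighting, sampled points concentrate near high-density regions of the teacher's scene instead of being spread uniformly over the volume, which is the data-efficiency argument made above.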

