Poster
UNIKD: UNcertainty-Filtered Incremental Knowledge Distillation for Neural Implicit Representation
Mengqi GUO · Chen Li · Hanlin Chen · Gim Hee Lee
# 284
Recent neural implicit representations (NIRs) have achieved great success in the tasks of 3D reconstruction and novel view synthesis. However, they require images of a scene from different camera views to be available for one-time training. This is especially expensive for scenarios with large-scale scenes and limited data storage. In view of this, we explore the task of incremental learning for NIRs in this work. We design a student-teacher framework to mitigate the catastrophic forgetting problem, in which a random inquirer and an uncertainty-based filter enable effective knowledge merging. As a result, the student network learns from new incoming data while simultaneously preserving old knowledge. Our proposed method is general and can thus be adapted to different implicit representations such as neural radiance fields (NeRF) and neural signed distance fields (SDF). Extensive experimental results on both 3D reconstruction and novel view synthesis demonstrate the effectiveness of our approach compared to different baselines.
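The interplay between the random inquirer and the uncertainty-based filter described above can be sketched as follows. This is a minimal illustration only: the function names, the squared-error distillation loss, and the uncertainty threshold `tau` are assumptions for exposition, not the paper's exact formulation.

```python
import numpy as np

def uncertainty_filtered_distillation(teacher_pred, teacher_unc,
                                      student_pred, tau=0.1):
    """Illustrative sketch (not the paper's exact loss): keep only the
    teacher outputs whose predictive uncertainty falls below `tau`, and
    average a squared-error distillation loss over the surviving queries.

    Returns (loss, keep_mask); loss is 0.0 when nothing passes the filter.
    """
    keep = teacher_unc < tau  # uncertainty-based filter on teacher outputs
    if not keep.any():
        return 0.0, keep
    loss = float(np.mean((student_pred[keep] - teacher_pred[keep]) ** 2))
    return loss, keep

# A "random inquirer" would sample query points/rays at random and ask the
# frozen teacher for predictions plus uncertainties; here we mock its output.
rng = np.random.default_rng(0)
teacher_pred = rng.normal(size=8)          # teacher's predictions at queries
teacher_unc = rng.uniform(0, 0.3, size=8)  # teacher's per-query uncertainty
student_pred = teacher_pred + rng.normal(scale=0.05, size=8)

loss, keep = uncertainty_filtered_distillation(teacher_pred, teacher_unc,
                                               student_pred, tau=0.1)
print(f"kept {keep.sum()} of {keep.size} queries, distill loss = {loss:.4f}")
```

In an incremental-learning loop this distillation term would be added to the reconstruction loss on the new data, so the student fits incoming views while the filtered teacher outputs anchor it to the old scene content.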