
Poster

DynoSurf: Neural Deformation-based Temporally Consistent Dynamic Surface Reconstruction

Yuxin Yao · Siyu Ren · Junhui Hou · Zhi Deng · Juyong Zhang · Wenping Wang

[ Project Page ]
Wed 2 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

This paper explores the problem of reconstructing temporally consistent surfaces from a 3D point cloud sequence without correspondence. To address this challenging task, we propose DynoSurf, an unsupervised learning framework that integrates a template surface representation with a learnable deformation field. Specifically, we design a coarse-to-fine strategy for learning the template surface based on the deformable tetrahedron representation. Furthermore, we propose a learnable deformation representation based on learnable control points and blending weights, which can deform the template surface non-rigidly while maintaining the consistency of local shape. Experimental results demonstrate the significant superiority of DynoSurf over current state-of-the-art approaches, showcasing its potential as a powerful tool for dynamic mesh reconstruction. The code will be publicly available.
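To make the control-point-and-blending-weight idea concrete, below is a minimal, hypothetical PyTorch sketch of a blended deformation field: learnable control points carry rigid motions, and each template vertex is displaced by a distance-weighted blend of those motions, which tends to keep local shape coherent. This is only an illustration under assumed names and shapes (e.g. `BlendedDeformation`, `num_control_points`), not the authors' implementation, which additionally involves the coarse-to-fine deformable-tetrahedron template described in the abstract.

```python
# Hypothetical sketch of control-point-based blended deformation (not the
# authors' code). Each control point has a learnable rigid transform; vertices
# are deformed by a softmax-weighted blend of those transforms.
import torch
import torch.nn as nn


class BlendedDeformation(nn.Module):
    def __init__(self, num_control_points: int = 32):
        super().__init__()
        # Learnable control-point positions and per-point rigid motions
        # (axis-angle rotation + translation); in practice one set per frame.
        self.control_points = nn.Parameter(torch.randn(num_control_points, 3) * 0.1)
        self.rotations = nn.Parameter(torch.zeros(num_control_points, 3))      # axis-angle
        self.translations = nn.Parameter(torch.zeros(num_control_points, 3))
        # Temperature controlling how sharply blending weights fall off with distance.
        self.log_sigma = nn.Parameter(torch.zeros(1))

    def blending_weights(self, verts: torch.Tensor) -> torch.Tensor:
        # Soft assignment of each vertex to control points via a distance-based softmax.
        d2 = torch.cdist(verts, self.control_points) ** 2            # (V, K)
        return torch.softmax(-d2 / self.log_sigma.exp(), dim=-1)     # (V, K)

    def forward(self, verts: torch.Tensor) -> torch.Tensor:
        # Rodrigues' formula: axis-angle vectors -> rotation matrices.
        theta = self.rotations.norm(dim=-1, keepdim=True).clamp_min(1e-8)       # (K, 1)
        axis = self.rotations / theta                                            # (K, 3)
        K_mat = torch.zeros(axis.shape[0], 3, 3, device=verts.device)
        K_mat[:, 0, 1], K_mat[:, 0, 2] = -axis[:, 2], axis[:, 1]
        K_mat[:, 1, 0], K_mat[:, 1, 2] = axis[:, 2], -axis[:, 0]
        K_mat[:, 2, 0], K_mat[:, 2, 1] = -axis[:, 1], axis[:, 0]
        I = torch.eye(3, device=verts.device).expand_as(K_mat)
        R = I + theta[..., None].sin() * K_mat \
              + (1 - theta[..., None].cos()) * (K_mat @ K_mat)                   # (K, 3, 3)

        # Rigid motion of each vertex as seen from every control point: R (v - c) + c + t.
        local = verts[:, None, :] - self.control_points[None, :, :]             # (V, K, 3)
        moved = torch.einsum('kij,vkj->vki', R, local) \
                + self.control_points + self.translations                        # (V, K, 3)

        # Blend the per-control-point candidates with the learned weights.
        w = self.blending_weights(verts)                                         # (V, K)
        return (w[..., None] * moved).sum(dim=1)                                 # (V, 3)


# Usage: deform template vertices toward a target frame, e.g. by optimizing the
# module's parameters against a Chamfer distance to that frame's point cloud.
template_verts = torch.rand(1000, 3)
deform = BlendedDeformation(num_control_points=32)
deformed = deform(template_verts)
```

Because every vertex shares the same small set of control-point transforms, nearby vertices receive similar blended motions, which is one common way such a representation keeps local shape consistent under non-rigid deformation.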
