

Poster

Per-Gaussian Embedding-Based Deformation for Deformable 3D Gaussian Splatting

Jeongmin Bae · Seoha Kim · Youngsik Yun · Hahyun Lee · Gun Bang · Youngjung Uh

Thu 3 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

As 3D Gaussian Splatting (3DGS) provides fast and high-quality novel view synthesis, it is a natural extension to deform a canonical 3DGS to multiple frames. However, we find that previous works fail to accurately reconstruct dynamic scenes, especially 1) static parts moving along with nearby dynamic parts, and 2) blurry motions. We attribute this failure to the design of the deformation field, which is built as a coordinate-based function even though 3DGS is a mixture of multiple fields centered at the Gaussians. Furthermore, previous methods consider only single-resolution temporal embeddings. To this end, we define the deformation as a function of per-Gaussian embeddings and temporal embeddings. Moreover, we decompose the deformation into coarse and fine deformations to model slow and fast movements, respectively. Last but not least, we introduce a training strategy for faster convergence and higher quality. Code will be available online.
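A minimal sketch of the idea described in the abstract, not the authors' released code: deformation is predicted from a learnable per-Gaussian embedding combined with temporal embeddings, and decomposed into coarse and fine components for slow and fast motion. All module names, dimensions, and output layouts below are hypothetical assumptions for illustration.

```python
import torch
import torch.nn as nn


class PerGaussianDeformation(nn.Module):
    """Hypothetical sketch: per-Gaussian embedding + temporal embedding -> deformation."""

    def __init__(self, num_gaussians: int, embed_dim: int = 32,
                 coarse_time_dim: int = 8, fine_time_dim: int = 64,
                 hidden: int = 128):
        super().__init__()
        # Learnable embedding per Gaussian (instead of a coordinate-based field).
        self.gaussian_embed = nn.Embedding(num_gaussians, embed_dim)
        # Separate temporal embeddings for slow (coarse) and fast (fine) motion.
        self.coarse_time = nn.Linear(1, coarse_time_dim)
        self.fine_time = nn.Linear(1, fine_time_dim)
        # Each head predicts offsets for position (3), rotation (4), and scale (3).
        self.coarse_head = nn.Sequential(
            nn.Linear(embed_dim + coarse_time_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 10))
        self.fine_head = nn.Sequential(
            nn.Linear(embed_dim + fine_time_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 10))

    def forward(self, gaussian_ids: torch.Tensor, t: torch.Tensor):
        # gaussian_ids: (N,) Gaussian indices; t: (N, 1) normalized timestamps.
        z = self.gaussian_embed(gaussian_ids)
        coarse = self.coarse_head(torch.cat([z, self.coarse_time(t)], dim=-1))
        fine = self.fine_head(torch.cat([z, self.fine_time(t)], dim=-1))
        delta = coarse + fine  # total deformation applied to the canonical Gaussians
        return delta[..., :3], delta[..., 3:7], delta[..., 7:10]


# Usage example (toy numbers): deform 1000 canonical Gaussians at a given time.
model = PerGaussianDeformation(num_gaussians=1000)
ids = torch.arange(1000)
t = torch.full((1000, 1), 0.5)
d_pos, d_rot, d_scale = model(ids, t)
```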
