

Poster

Human Hair Reconstruction with Strand-Aligned 3D Gaussians

Egor Zakharov · Vanessa Sklyarova · Michael J. Black · Giljoo Nam · Justus Thies · Otmar Hilliges

Strong Double Blind: this paper was not made available on public preprint services during the review process.
Thu 3 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

We introduce a new hair modeling method that uses a dual representation of classical hair strands and 3D Gaussians to produce accurate and realistic strand-based reconstructions from multi-view data. In contrast to recent approaches that leverage unstructured Gaussians to model human avatars, our method enforces a reconstruction of the hair in the form of 3D polylines, or strands. This fundamental difference allows us to use the resulting hairstyles out-of-the-box in modern computer graphics engines for editing, rendering, and simulation. To reconstruct strand-based hair from images, we introduce a new 3D line lifting method that utilizes unstructured Gaussians to represent the hairstyle's 3D surface. We use this intermediate reconstruction to generate multi-view geometric ground truth data to supervise the fitting of the hair strands. The hairstyle itself is represented in the form of the so-called strand-aligned 3D Gaussians. This representation allows us to combine strand-based hair priors, which are essential for realistic modeling of the inner structure of hairstyles, with the differentiable rendering capabilities of 3D Gaussian Splatting. We evaluate our method on synthetic and real hairstyles and demonstrate state-of-the-art performance in the task of strand-based hair reconstruction.
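To make the "strand-aligned 3D Gaussians" idea concrete, here is a minimal sketch of one plausible parameterization: each segment of a strand polyline carries an anisotropic Gaussian whose mean sits at the segment midpoint and whose longest principal axis follows the segment direction. The function name, the `thickness` parameter, and the exact scale layout are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def strand_to_gaussians(strand, thickness=1e-3):
    """Attach one anisotropic 3D Gaussian to each segment of a strand polyline.

    `strand` is an (N, 3) array of polyline vertices. Returns per-segment
    means, unit direction vectors, and (long, thin, thin) scales.
    Illustrative parameterization only; not the paper's exact scheme.
    """
    strand = np.asarray(strand, dtype=float)
    starts, ends = strand[:-1], strand[1:]
    means = 0.5 * (starts + ends)                   # segment midpoints
    deltas = ends - starts
    lengths = np.linalg.norm(deltas, axis=1, keepdims=True)
    dirs = deltas / np.clip(lengths, 1e-12, None)   # unit segment directions
    # Long along the strand (half the segment length), thin across it.
    scales = np.concatenate(
        [0.5 * lengths, np.full((len(means), 2), thickness)], axis=1
    )
    return means, dirs, scales
```

Tying the Gaussians to polyline segments like this is what lets strand-level priors and 3D Gaussian Splatting's differentiable rendering share one set of optimizable parameters: gradients from rendering flow back to the strand vertices themselves.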
