

Poster

SweepNet: Unsupervised Learning Shape Abstraction via Neural Sweepers

Mingrui Zhao · Yizhi Wang · Fenggen Yu · Changqing Zou · Ali Mahdavi-Amiri

# 308
Strong Double Blind: this paper was not made available on public preprint services during the review process.
Thu 3 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Shape abstraction is an important task for simplifying complex geometric structures while retaining essential features. Sweep surfaces, commonly found in human-made objects, aid in this process by effectively capturing and representing object geometry, thereby facilitating abstraction. In this paper, we introduce SweepNet, a novel approach to shape abstraction through sweep surfaces. We propose an effective parameterization for sweep surfaces, using superellipses for the profile and B-spline curves for the axis. This compact representation, requiring as few as 14 floating-point numbers, enables intuitive and interactive editing while preserving shape details. Additionally, by introducing a differentiable neural sweeper and an encoder-decoder architecture, we demonstrate the ability to predict swept volume representations without supervision. We show the superiority of our model through several quantitative and qualitative experiments.
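To make the sweep-surface parameterization concrete, here is a minimal numpy sketch of the general idea: a superellipse profile (|x/a|^n + |y/b|^n = 1) placed in the plane normal to a smooth axis curve. This is an illustrative reconstruction, not the authors' implementation: the function names are made up, a cubic Bézier stands in for the paper's B-spline axis, and the frame construction uses a fixed reference vector rather than a rotation-minimizing frame.

```python
import numpy as np

def superellipse(t, a, b, n):
    """Superellipse profile |x/a|^n + |y/b|^n = 1, parameterized by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.stack([a * np.sign(c) * np.abs(c) ** (2.0 / n),
                     b * np.sign(s) * np.abs(s) ** (2.0 / n)], axis=-1)

def bezier_axis(u, P):
    """Cubic Bezier through 4 control points P, a stand-in for a B-spline axis."""
    u = u[:, None]
    return ((1 - u) ** 3 * P[0] + 3 * (1 - u) ** 2 * u * P[1]
            + 3 * (1 - u) * u ** 2 * P[2] + u ** 3 * P[3])

def sweep_surface(P, a, b, n, n_u=64, n_t=64):
    """Sample a swept surface by placing the profile normal to the axis."""
    u = np.linspace(0.0, 1.0, n_u)
    t = np.linspace(0.0, 2.0 * np.pi, n_t, endpoint=False)
    C = bezier_axis(u, P)                              # axis samples, (n_u, 3)
    # Finite-difference tangents along the axis, normalized.
    tang = np.gradient(C, axis=0)
    tang /= np.linalg.norm(tang, axis=1, keepdims=True)
    # Orthonormal frame from a fixed reference vector (assumes the axis is
    # never parallel to it; a rotation-minimizing frame would avoid twist).
    ref = np.array([0.0, 0.0, 1.0])
    side = np.cross(tang, ref)
    side /= np.linalg.norm(side, axis=1, keepdims=True)
    up = np.cross(tang, side)
    prof = superellipse(t, a, b, n)                    # profile samples, (n_t, 2)
    return (C[:, None, :]
            + prof[None, :, 0:1] * side[:, None, :]
            + prof[None, :, 1:2] * up[:, None, :])     # (n_u, n_t, 3)
```

With n = 2 the profile is an ordinary ellipse; larger n gives box-like cross-sections. Counting parameters this way (a handful of axis control points plus a few profile numbers) gives an intuition for how a primitive can fit in roughly a dozen floats, though the paper's exact 14-number layout is not specified here.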
