

Poster

REDIR: Refocus-free Event-based De-occlusion Image Reconstruction

Qi Guo · Hailong Shi · Huan Li · Jinsheng Xiao · Xingyu Gao

#314
Strong Double Blind: this paper was not made available on public preprint services during the review process.
Thu 3 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

The event-based synthetic aperture imaging (E-SAI) technique, which can capture high-frequency light intensity variations, has been widely applied to scene de-occlusion reconstruction tasks. However, existing methods usually require prior information and impose strict restrictions on camera motion during SAI acquisition. This paper proposes REDIR, a novel end-to-end refocus-free de-occlusion image reconstruction approach for variable E-SAI, which aligns the global and local features of variable event data and effectively achieves high-resolution imaging from pure event streams. To further improve reconstruction of the occluded target, we propose a perceptual mask-gated connection module that interlinks information between modules, and we incorporate a spatial-temporal attention mechanism into the SNN block to enhance the model's target extraction ability. Extensive experiments show that our model achieves state-of-the-art reconstruction quality on the traditional E-SAI dataset without prior information, while the effectiveness of the variable-event-data feature registration method is verified on our newly introduced V-ESAI dataset, which obviates the reliance on prior knowledge and extends the applicability of SAI acquisition methods by allowing focus changes, lens rotations, and non-uniform motion.
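The abstract names two architectural ideas: a spatial-temporal attention mechanism inside an SNN block, and a perceptual mask-gated connection between modules. The paper's actual implementation is not shown on this page; the following is only a minimal, hypothetical PyTorch sketch of what such components could look like (all class names, shapes, and hyperparameters are assumptions, not the authors' code).

```python
# Hypothetical sketch, not the authors' implementation: a simplified LIF spiking
# block over event voxel frames, a spatio-temporal attention gate, and a
# mask-gated connection that forwards features weighted by a soft occlusion mask.
import torch
import torch.nn as nn


class LIFNeuron(nn.Module):
    """Simplified leaky integrate-and-fire neuron with a hard threshold and reset."""
    def __init__(self, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        self.tau, self.v_th = tau, v_th

    def forward(self, x):  # x: (T, B, C, H, W), time-major event features
        v = torch.zeros_like(x[0])
        spikes = []
        for t in range(x.shape[0]):
            v = v + (x[t] - v) / self.tau      # leaky integration
            s = (v >= self.v_th).float()       # fire when membrane crosses threshold
            v = v * (1.0 - s)                  # hard reset after a spike
            spikes.append(s)
        return torch.stack(spikes, dim=0)


class SpatioTemporalAttention(nn.Module):
    """Temporal/channel gating followed by a per-frame spatial gate."""
    def __init__(self, channels: int):
        super().__init__()
        self.temporal = nn.Sequential(
            nn.AdaptiveAvgPool3d(1), nn.Conv3d(channels, channels, 1), nn.Sigmoid()
        )
        self.spatial = nn.Sequential(nn.Conv2d(channels, 1, 7, padding=3), nn.Sigmoid())

    def forward(self, x):  # x: (T, B, C, H, W)
        T = x.shape[0]
        y = x.permute(1, 2, 0, 3, 4)           # (B, C, T, H, W) for 3D pooling
        y = y * self.temporal(y)               # temporal/channel attention weights
        y = y.permute(2, 0, 1, 3, 4)           # back to (T, B, C, H, W)
        gates = torch.stack([self.spatial(y[t]) for t in range(T)], dim=0)
        return y * gates                       # spatial attention per time step


class MaskGatedConnection(nn.Module):
    """Forward features to the next module, gated by a predicted soft mask."""
    def __init__(self, channels: int):
        super().__init__()
        self.mask_head = nn.Sequential(nn.Conv2d(channels, 1, 3, padding=1), nn.Sigmoid())

    def forward(self, feat):                   # feat: (B, C, H, W)
        mask = self.mask_head(feat)            # soft mask highlighting the visible target
        return feat * mask, mask


if __name__ == "__main__":
    T, B, C, H, W = 8, 1, 16, 64, 64
    events = torch.rand(T, B, C, H, W)
    feats = SpatioTemporalAttention(C)(LIFNeuron()(events))
    gated, mask = MaskGatedConnection(C)(feats.mean(dim=0))
    print(gated.shape, mask.shape)  # (1, 16, 64, 64), (1, 1, 64, 64)
```

In this sketch the mask-gated connection suppresses foreground-occluder responses before features are passed on, which is one plausible reading of the "interlink information between modules" description above.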
