

Poster

Deep Patch Visual SLAM

Lahav Lipson · Zachary Teed · Jia Deng

# 184
Strong Double Blind: this paper was not made available on public preprint services during the review process.
Fri 4 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Recent work in visual odometry and SLAM has shown the effectiveness of deep network backbones. Despite excellent performance, such approaches are often expensive to run or do not generalize well zero-shot. To address this problem, we introduce Deep Patch Visual SLAM (DPV-SLAM), a new system for monocular visual SLAM built on the DPVO visual odometry system. We introduce (1) a backend for long-term loop closure and (2) a separate mid-term backend with efficient global optimization. On real-world datasets, DPV-SLAM runs at 2x real-time framerates. We achieve the same accuracy as DROID-SLAM on EuRoC while running twice as fast using a third of the VRAM. We also outperform DROID-SLAM by large margins on KITTI and TartanAir.
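To illustrate the architecture described in the abstract (a visual-odometry frontend plus two backends, one mid-term and one long-term), here is a minimal Python sketch. It is a hypothetical skeleton, not the authors' implementation: the class names (`OdometryFrontend`, `MidTermBackend`, `LoopClosureBackend`), the triggering heuristics, and all method signatures are assumptions made for illustration only.

```python
# Hypothetical sketch of a DPV-SLAM-style pipeline: a patch-based odometry
# frontend feeding (1) a mid-term backend for global optimization over recent
# keyframes and (2) a long-term loop-closure backend. All names and logic here
# are illustrative assumptions, not the actual DPV-SLAM API.

from dataclasses import dataclass, field


@dataclass
class Keyframe:
    index: int
    pose: tuple                                   # placeholder for an SE(3) pose estimate
    patches: list = field(default_factory=list)   # sparse patch features (placeholder)


class OdometryFrontend:
    """Stand-in for a DPVO-style frontend that tracks camera motion per frame."""

    def track(self, frame_index: int) -> Keyframe:
        # A real frontend would run the deep patch network and local optimization.
        return Keyframe(index=frame_index, pose=(0.0, 0.0, float(frame_index)))


class MidTermBackend:
    """Periodically refines a window of recent keyframes (global optimization)."""

    def __init__(self, window: int = 10):
        self.window = window

    def optimize(self, keyframes: list) -> None:
        recent = keyframes[-self.window:]
        # Placeholder: a real backend would solve a bundle-adjustment problem here.
        print(f"mid-term optimization over keyframes {[k.index for k in recent]}")


class LoopClosureBackend:
    """Detects revisited places and adds long-term constraints."""

    def detect_and_close(self, keyframes: list) -> None:
        # Placeholder heuristic; real detection would compare image/patch descriptors.
        if len(keyframes) >= 2 and keyframes[-1].index % 50 == 0:
            print(f"loop-closure candidate at keyframe {keyframes[-1].index}")


def run_slam(num_frames: int = 100) -> list:
    frontend = OdometryFrontend()
    mid_term = MidTermBackend(window=10)
    loop_closure = LoopClosureBackend()
    keyframes = []

    for i in range(num_frames):
        keyframes.append(frontend.track(i))
        if i % 10 == 0:              # assumed trigger for the mid-term backend
            mid_term.optimize(keyframes)
        loop_closure.detect_and_close(keyframes)
    return keyframes


if __name__ == "__main__":
    run_slam(100)
```

The split mirrors the abstract's claim: keeping the expensive global optimization confined to a mid-term window, with loop closure handled by a separate backend, is what would allow the frontend to keep running near real time.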
