

Poster

Real-time Holistic Robot Pose Estimation with Unknown States

Shikun Ban · Juling Fan · Xiaoxuan Ma · Wentao Zhu · YU QIAO · Yizhou Wang

# 134
[ Project Page ] [ Paper PDF ]
Wed 2 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract: Estimating robot pose from RGB images is a crucial problem in computer vision and robotics. While previous methods have achieved promising performance, most of them presume full knowledge of the robot's internal states, e.g., ground-truth robot joint angles. However, this assumption is not always valid in practical situations. In real-world applications such as multi-robot collaboration or human-robot interaction, the robot joint states might not be shared or could be unreliable. On the other hand, existing approaches that estimate robot pose without joint state priors suffer from heavy computational burdens and thus cannot support real-time applications. This work introduces an efficient framework for real-time robot pose estimation from RGB images without requiring known robot states. Our method estimates camera-to-robot rotation, robot state parameters, keypoint locations, and root depth, employing a neural network module for each task to facilitate learning and sim-to-real transfer. Notably, it achieves inference in a single feed-forward pass without iterative optimization. Our approach offers a 12$\times$ speedup with state-of-the-art accuracy, enabling real-time holistic robot pose estimation for the first time.
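The abstract describes a decomposed architecture: one lightweight neural module per estimation target (camera-to-robot rotation, joint state parameters, 2D keypoints, root depth), all evaluated in a single feed-forward pass with no iterative optimization. Below is a minimal, hypothetical sketch of that structure in PyTorch; the backbone, head designs, feature dimension, joint/keypoint counts, and the 6D rotation representation are all illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class HolisticRobotPoseNet(nn.Module):
    """Hypothetical sketch: one sub-network per estimation target,
    all computed in a single feed-forward pass (no iterative refinement)."""

    def __init__(self, feat_dim=256, num_joints=7, num_keypoints=14):
        super().__init__()
        self.num_keypoints = num_keypoints
        # Stand-in image encoder; a real system would use a stronger backbone.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # One lightweight head per task, mirroring the abstract's decomposition.
        self.rotation_head = nn.Linear(feat_dim, 6)          # 6D rotation rep (assumption)
        self.joint_state_head = nn.Linear(feat_dim, num_joints)
        self.keypoint_head = nn.Linear(feat_dim, num_keypoints * 2)
        self.depth_head = nn.Linear(feat_dim, 1)             # robot root depth

    def forward(self, img):
        f = self.backbone(img)
        return {
            "rotation": self.rotation_head(f),
            "joint_states": self.joint_state_head(f),
            "keypoints_2d": self.keypoint_head(f).view(-1, self.num_keypoints, 2),
            "root_depth": self.depth_head(f),
        }
```

Because every quantity comes from one shared forward pass, inference cost is that of a single network evaluation, which is what makes the claimed real-time operation plausible compared with optimization-based alternatives.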
