Current Mixed Reality systems aim to estimate a user's full-body joint configuration from the poses of only a few end effectors, primarily the head and hands. Existing methods often solve inverse kinematics (IK) to recover the full skeleton from these sparse observations, usually by directly optimizing the joint-angle parameters of a human skeleton. Since errors accumulate along the kinematic tree, the predicted end-effector poses fail to align with the provided input poses, leading to discrepancies between predicted and actual hand positions, or to feet that penetrate the ground. In this paper, we first refine the commonly used SMPL parametric model by embedding anatomical constraints that reduce the degrees of freedom of specific joints to more closely mirror human biomechanics. This ensures that our model yields physically plausible pose predictions. We then propose NeCoIK, a fully differentiable neural constrained IK solver for full-body motion tracking from only a person's head and hand poses. NeCoIK is based on swivel angle prediction, matches the input poses exactly, and avoids ground penetration. We evaluate NeCoIK in extensive experiments on motion capture datasets and demonstrate that our method surpasses the state of the art in both quantitative and qualitative results at a fast inference speed.
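To make the swivel-angle idea concrete, the following is a minimal sketch of the standard analytic parameterization it builds on, not the paper's actual solver: for a two-link arm with known shoulder and wrist positions, the elbow is constrained to a circle around the shoulder-wrist axis, and a single swivel angle selects the point on that circle. All names (`elbow_from_swivel`, `ref_axis`) and the choice of reference axis are illustrative assumptions.

```python
import numpy as np

def elbow_from_swivel(shoulder, wrist, l_upper, l_fore, phi,
                      ref_axis=np.array([0.0, 0.0, -1.0])):
    """Analytic elbow position for a two-link arm given a swivel angle phi.

    Illustrative sketch (not NeCoIK itself): the elbow lies on a circle
    around the shoulder-wrist axis; phi picks the point on that circle.
    ref_axis fixes the phi = 0 direction (a convention choice).
    """
    sw = wrist - shoulder
    d = np.linalg.norm(sw)
    # Clamp the target distance to the reachable annulus of the chain.
    d = np.clip(d, abs(l_upper - l_fore) + 1e-6, l_upper + l_fore - 1e-6)
    n = sw / np.linalg.norm(sw)
    # Law of cosines: angle at the shoulder between the axis and upper arm.
    cos_a = (l_upper**2 + d**2 - l_fore**2) / (2.0 * l_upper * d)
    cos_a = np.clip(cos_a, -1.0, 1.0)
    sin_a = np.sqrt(1.0 - cos_a**2)
    center = shoulder + cos_a * l_upper * n   # circle center on the axis
    radius = sin_a * l_upper                  # circle radius
    # Orthonormal frame (u, v) spanning the plane perpendicular to n.
    u = ref_axis - np.dot(ref_axis, n) * n
    if np.linalg.norm(u) < 1e-8:              # ref_axis parallel to the axis
        u = np.array([1.0, 0.0, 0.0]) - n[0] * n
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    return center + radius * (np.cos(phi) * u + np.sin(phi) * v)
```

Because the bone lengths are satisfied by construction for any phi, a network that predicts only the swivel angle cannot misplace the end effector, which is the property the abstract refers to; predicting phi with a neural network while keeping this geometry fixed keeps the whole pipeline differentiable.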