In this work, we describe a method for large-scale 3D cell tracking through a segmentation selection approach. The proposed method is effective at tracking cells across large microscopy datasets on two fronts: (i) it can solve problems containing millions of segmentation instances in terabyte-scale 3D+t datasets; (ii) it achieves competitive results with or without deep learning, bypassing the requirement for annotated 3D data, which is scarce in the fluorescence microscopy field. The proposed method computes cell tracks and segments using a hierarchy of segmentation hypotheses and selects disjoint segments by maximizing the overlap between adjacent frames. We show that this method is the first to achieve state-of-the-art results in both nuclei- and membrane-based cell tracking, evaluating it on the 2D epithelial cell benchmark and 3D images from the Cell Tracking Challenge. Furthermore, it admits a faster integer linear programming formulation than previous approaches, and the framework is flexible, supporting segmentations from individual off-the-shelf cell segmentation models or their combination as an ensemble. The code is available as supplementary material.
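To make the segment-selection idea concrete, the following is a minimal sketch (not the authors' implementation) of the kind of integer linear program described above, written with the PuLP library. The segment identifiers, the conflict sets encoding nested hypotheses from the segmentation hierarchy, and the overlap (IoU) scores are all illustrative assumptions.

```python
import pulp

# Candidate segmentation hypotheses in two adjacent frames. "ab" stands for
# the parent hypothesis in the hierarchy that merges "a" and "b", so it
# conflicts with each of its children and cannot be selected alongside them.
frame0 = ["a", "b", "ab"]
frame1 = ["c", "d"]
conflicts = [{"a", "ab"}, {"b", "ab"}]

# Illustrative overlap (IoU) scores between hypotheses in adjacent frames.
iou = {("a", "c"): 0.8, ("b", "d"): 0.7, ("ab", "c"): 0.5, ("ab", "d"): 0.4}

prob = pulp.LpProblem("segment_selection", pulp.LpMaximize)

# x[s] = 1 if hypothesis s is selected; y[(i, j)] = 1 if segment i in frame t
# is linked to segment j in frame t+1.
x = {s: pulp.LpVariable(f"x_{s}", cat="Binary") for s in frame0 + frame1}
y = {e: pulp.LpVariable(f"y_{e[0]}_{e[1]}", cat="Binary") for e in iou}

# Objective: maximize the total overlap of the selected frame-to-frame links.
prob += pulp.lpSum(iou[e] * y[e] for e in iou)

# A link may be active only if both of its endpoint segments are selected.
for (i, j), link in y.items():
    prob += link <= x[i]
    prob += link <= x[j]

# Disjointness: at most one hypothesis per conflicting (nested) pair, so the
# chosen segments within a frame do not overlap.
for cset in conflicts:
    prob += pulp.lpSum(x[s] for s in cset) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({s: int(v.value()) for s, v in x.items()})  # selected hypotheses
```

On this toy instance, the solver picks the child hypotheses "a" and "b" over the merged parent "ab", since their links to "c" and "d" yield a higher total overlap; the same trade-off, at the scale of millions of variables, is what the paper's formulation resolves.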