Poster

Hetecooper: Feature Collaboration Graph for Heterogeneous Collaborative Perception

Congzhang Shao · Guiyang Luo · Quan Yuan · Yifu Chen · Yilin Liu · Gong Kexin · Jinglin Li

Poster #51
Strong Double Blind: this paper was not made available on public preprint services during the review process.
Fri 4 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Collaborative perception effectively expands the perception range of agents by sharing perceptual information, addressing the occlusion problem in single-vehicle perception. However, existing collaboration methods all rest on the assumption that the perception models are isomorphic; when agents use different perception model architectures, their intermediate features differ in size, channel count, and semantic space, which poses challenges for collaboration. We introduce Hetecooper, a collaborative perception framework for scenarios with heterogeneous perception models. Hetecooper models the correlation between heterogeneous features with a feature collaboration graph, which retains the complete information of the features and automatically adapts to changes in feature size. Moreover, we design a graph-transformer-based method to transfer feature messages within the graph. First, the semantic space of the nodes is unified by a semantic mapper. Then, neighbor information is aggregated through attention guided by edge weights. Finally, the graph nodes are reorganized into complete features, achieving effective fusion of heterogeneous features. Test results demonstrate that our method achieves superior performance in both model-isomorphism and model-heterogeneity scenarios and also exhibits good scalability. Our code will be open-sourced.
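Only the abstract is public, but the pipeline it describes (unify semantics with a mapper, aggregate neighbor nodes with edge-weight-guided attention, reorganize nodes back into a feature map) can be sketched roughly in PyTorch. Everything below is an assumption for illustration, not the authors' implementation: the module names, the linear semantic mapper, and the additive edge-weight bias on the attention logits are all guesses from the abstract alone.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticMapper(nn.Module):
    """Hypothetical mapper: projects a collaborator's node features
    (arbitrary channel count) into the ego agent's semantic space."""
    def __init__(self, in_channels: int, ego_channels: int):
        super().__init__()
        self.proj = nn.Linear(in_channels, ego_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(x)

class EdgeGuidedAttention(nn.Module):
    """Single-head attention over a node's neighbors in the feature
    collaboration graph, with logits biased by scalar edge weights
    (one plausible reading of 'attention guided by edge weights')."""
    def __init__(self, channels: int):
        super().__init__()
        self.q = nn.Linear(channels, channels)
        self.k = nn.Linear(channels, channels)
        self.v = nn.Linear(channels, channels)

    def forward(self, ego: torch.Tensor, nbrs: torch.Tensor,
                edge_w: torch.Tensor) -> torch.Tensor:
        # ego: (N, C) ego-feature nodes; nbrs: (N, K, C) K neighbor
        # nodes per ego node; edge_w: (N, K) scalar edge weights.
        q = self.q(ego).unsqueeze(1)                      # (N, 1, C)
        k, v = self.k(nbrs), self.v(nbrs)                 # (N, K, C)
        logits = (q * k).sum(-1) / k.size(-1) ** 0.5      # (N, K)
        attn = F.softmax(logits + edge_w, dim=-1)         # edge-weight bias
        return ego + (attn.unsqueeze(-1) * v).sum(dim=1)  # residual fusion

# Toy usage: a 10x10 BEV grid (100 nodes), ego features with 64 channels,
# and one heterogeneous collaborator contributing 4 neighbor nodes per
# ego node with 128 channels.
mapper = SemanticMapper(in_channels=128, ego_channels=64)
fuse = EdgeGuidedAttention(channels=64)
ego_nodes = torch.randn(100, 64)
nbr_nodes = mapper(torch.randn(100, 4, 128))   # unify semantic space
fused = fuse(ego_nodes, nbr_nodes, torch.rand(100, 4))
feature_map = fused.T.reshape(1, 64, 10, 10)   # reorganize nodes into a map
```

The construction of the graph itself, i.e., how nodes and edges are defined across agents with differently sized features, is where the paper's contribution lies and is not reproduced in this sketch.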
