Hybrid Robust Collaborative Perception with LiDAR-4D Radar Fusion under Adverse Weather Conditions
Abstract
Current collaborative perception systems have significantly improved 3D object detection performance. However, the widely used LiDAR and camera sensors often suffer performance degradation under adverse weather conditions. Weather-robust 4D radar offers a promising solution to this challenge. Nevertheless, effectively fusing sparse 4D radar measurements with degraded LiDAR data remains difficult due to cross-modal corruption and information loss. In this work, we propose a novel hybrid robust collaborative perception framework (HRCP), designed to improve collaborative perception performance under adverse weather conditions through LiDAR-4D radar fusion. Specifically, we introduce a hybrid collaboration strategy that accounts for the distinct physical properties of the two modalities and processes them differently during information transmission. Additionally, we propose a bidirectional cross-modal gating (BCMG) module, which enables LiDAR and 4D radar to mutually validate feature reliability and ensures consistent cross-modal representations, and an adaptive feature enhancement (AFE) module, which comprehensively refines degraded and suppressed regions to mitigate information loss. Extensive experiments demonstrate that our method outperforms previous state-of-the-art approaches under adverse weather conditions.
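The mutual-validation idea behind the BCMG module can be illustrated with a minimal NumPy sketch, assuming each modality produces a sigmoid gate over the other's feature channels so that unreliable (e.g., weather-corrupted) responses are attenuated. All shapes, weights, and function names here are hypothetical illustrations, not the paper's implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bidirectional_gating(feat_lidar, feat_radar, w_r2l, w_l2r):
    """Each modality gates the other's channels: radar features
    validate LiDAR reliability and vice versa (hypothetical sketch)."""
    gate_for_lidar = sigmoid(feat_radar @ w_r2l)  # radar -> gate over LiDAR
    gate_for_radar = sigmoid(feat_lidar @ w_l2r)  # LiDAR -> gate over radar
    return feat_lidar * gate_for_lidar, feat_radar * gate_for_radar

# Hypothetical example: 16 spatial cells, 8 feature channels per modality.
rng = np.random.default_rng(0)
N, C = 16, 8
feat_l = rng.standard_normal((N, C))
feat_r = rng.standard_normal((N, C))
w_r2l = rng.standard_normal((C, C)) * 0.1
w_l2r = rng.standard_normal((C, C)) * 0.1
out_l, out_r = bidirectional_gating(feat_l, feat_r, w_r2l, w_l2r)
```

Because the gates lie in (0, 1), the gated features never exceed the original magnitudes; channels the other modality deems unreliable are suppressed rather than removed outright.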