Bridging RGB and Hematoxylin Components: An Interleaved Guidance and Fusion Framework for Point Supervised Nuclei Segmentation
Abstract
Nuclei instance segmentation in histopathology images is essential for accurate diagnosis and downstream computational tasks, yet it relies heavily on expensive pixel-level annotations. Although point-level annotations substantially reduce the annotation burden on pathologists, many existing methods use only a single image type and overlook the complementary information contained in alternative representations. To address this limitation, we propose DFGNet, a weakly supervised framework that performs dual-representation complementary fusion and interleaved guidance learning by jointly modeling RGB images and their corresponding Hematoxylin components. From the complementary-fusion perspective, we propose a Reciprocal Cross-scale Dynamic Fusion (RCDF) module and an Entropy Confidence Aggregation Unit (ECAU) to integrate multi-scale complementary cues and adaptively combine the outputs of the dual branches. For interleaved guidance, we further propose an Interleaved point-Guided Attention (IGA) module that enables bidirectional refinement between the segmentation task and the nucleus-prediction task. Extensive experiments on three benchmark datasets show that DFGNet achieves state-of-the-art performance across multiple metrics and significantly outperforms existing approaches. DFGNet also generalizes well across different tissue types and is remarkably robust to annotation shifts, providing a low-cost, scalable solution for practical clinical applications.