Multi-Scale Gradient-Guided Unrolling Architecture with Adaptive Mamba for Compressive Sensing
Abstract
In the field of Compressive Sensing (CS), deep unrolling networks (DUNs) have demonstrated exceptional performance and interpretability by integrating traditional optimization solvers with deep networks. However, existing DUNs suffer from homogenized cross-stage feature extraction and insufficient integration of gradient-guided information. Additionally, their feature extraction modules struggle to balance a global receptive field against computational efficiency, which limits improvements in image reconstruction details. To address these challenges, we propose a multi-scale gradient-guided unrolling architecture with adaptive Mamba for CS, named MambaCS. Specifically, we utilize our customized Adaptive State-Space Block (A-SSB) to unroll the well-known Proximal Gradient Descent (PGD) algorithm across multiple feature levels, extracting comprehensive image features while maintaining computational efficiency. Moreover, we design a High-Dimensional Gradient Fusion (HDGF) module that ensures the persistent and stable injection of gradient-guided information across scales and dimensions while effectively eliminating information bottlenecks. Finally, we develop a Feature-Adaptive Proximal Operator (FAPO), using A-SSB as an extension of the sparse basis associated with the PGD proximal operator, which enhances sensitivity to multi-scale features and improves detail reconstruction. Extensive experiments demonstrate the significant advantages of our proposed MambaCS over current state-of-the-art (SOTA) methods.
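For readers unfamiliar with the algorithm being unrolled, the following is a minimal sketch of classical PGD (ISTA) for sparse CS recovery, where the proximal step is plain soft-thresholding. In MambaCS, each iteration would instead be a learned network stage and the soft-threshold would be replaced by the learned proximal module; the function and variable names below are illustrative, not the paper's implementation.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pgd_cs(A, y, lam=0.01, eta=None, n_iters=500):
    # Classical PGD/ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
    # Deep unrolling networks replace each fixed iteration with a
    # trainable stage; this is only the optimization-side baseline.
    if eta is None:
        # Step size from the Lipschitz constant of the gradient, ||A||_2^2.
        eta = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)                        # data-fidelity gradient
        x = soft_threshold(x - eta * grad, eta * lam)   # proximal (sparsity) step
    return x

# Toy recovery: a 5-sparse signal from 64 random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128)) / np.sqrt(64)  # random sensing matrix
x_true = np.zeros(128)
x_true[rng.choice(128, 5, replace=False)] = 1.0
y = A @ x_true
x_hat = pgd_cs(A, y)
```

The fixed soft-threshold assumes sparsity in the canonical basis; the paper's FAPO generalizes exactly this step by learning a feature-adaptive transform in its place.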