
Poster

Fourier Priors-Guided Diffusion for Zero-Shot Joint Low-Light Enhancement and Deblurring

Xiaoqian Lv · Shengping Zhang · Chenyang Wang · Yichen Zheng · Bineng Zhong · Chongyi Li · Liqiang Nie


Abstract:

Existing joint low-light enhancement and deblurring methods learn pixel-wise mappings from paired synthetic data, which results in limited generalization to real-world scenes. While some studies explore the rich generative prior of pre-trained diffusion models, they typically rely on an assumed degradation process and cannot handle unknown real-world degradations well. To address these problems, we propose a novel zero-shot framework, FourierDiff, which embeds Fourier priors into a pre-trained diffusion model to harmoniously handle the joint degradation of luminance and structure. FourierDiff is appealing for its relaxed requirements on paired training data and degradation assumptions. The key zero-shot insight is motivated by image characteristics in the Fourier domain: most luminance information is concentrated in the amplitude, while structure and content information are closely related to the phase. Based on this observation, we decompose the sampled results of the reverse diffusion process in the Fourier domain and leverage the amplitude of the generative prior to align the enhanced brightness with the distribution of natural images. To yield a sharp and content-consistent enhanced result, we further design a spatial-frequency alternating optimization strategy to progressively refine the phase of the input. Extensive experiments demonstrate the superior effectiveness of the proposed method, especially in real-world scenes.
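To make the central amplitude-phase idea concrete, below is a minimal PyTorch sketch of the Fourier decomposition the abstract describes: combining the amplitude of a generative-prior sample (carrying luminance) with the phase of the low-light input (carrying structure and content). This is an illustrative assumption of how such a swap could be written, not the authors' implementation; the function name and toy tensors are hypothetical.

```python
import torch


def amplitude_phase_swap(low_light: torch.Tensor, prior: torch.Tensor) -> torch.Tensor:
    """Recombine the amplitude of `prior` with the phase of `low_light`
    via the 2D Fourier transform (illustrative sketch, not the paper's code)."""
    # 2D FFT over the spatial dimensions (H, W).
    fft_low = torch.fft.fft2(low_light)
    fft_prior = torch.fft.fft2(prior)

    # Amplitude concentrates luminance; phase encodes structure and content.
    amp_prior = torch.abs(fft_prior)
    phase_low = torch.angle(fft_low)

    # Recompose: generative-prior amplitude + input phase, then inverse FFT.
    recombined = amp_prior * torch.exp(1j * phase_low)
    return torch.fft.ifft2(recombined).real


# Toy usage with random tensors standing in for (B, C, H, W) images.
if __name__ == "__main__":
    low = torch.rand(1, 3, 64, 64) * 0.2  # dim, low-light-like input
    gen = torch.rand(1, 3, 64, 64)        # stand-in for a diffusion prior sample
    out = amplitude_phase_swap(low, gen)
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

In the paper's framework this kind of recombination is applied to intermediate results of the reverse diffusion process, with the phase further refined by the spatial-frequency alternating optimization described above.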
