

Batch Normalization Alleviates the Spectral Bias in Coordinate Networks

Zhicheng Cai · Hao Zhu · Qiu Shen · Xinran Wang · Xun Cao

Arch 4A-E Poster #92
Fri 21 Jun 5 p.m. PDT — 6:30 p.m. PDT


Coordinate networks have recently come to dominate signal representation for inverse problems and are widely applied in scientific computing tasks. However, coordinate networks suffer from spectral bias, which limits their capacity to learn high-frequency components. This problem is caused by the pathological distribution of the eigenvalues of the coordinate network's neural tangent kernel (NTK). We find that this pathological distribution can be improved using classical batch normalization (BN), a common deep learning technique that is rarely used in coordinate networks. BN greatly reduces the maximum and the variance of the NTK's eigenvalues while only slightly changing their mean. Since the maximum eigenvalue is much larger than most of the others, this reduction in variance shifts the eigenvalue distribution from lower values toward higher ones, thereby alleviating the spectral bias. This observation is substantiated by the significant improvements obtained when applying BN-based coordinate networks to a variety of tasks, including image compression, computed tomography reconstruction, shape representation, magnetic resonance imaging, and novel view synthesis.
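The claimed mechanism can be probed numerically. The following is a minimal illustrative sketch (not the paper's implementation): it computes the empirical NTK, K = J Jᵀ with J the per-input parameter Jacobian, of a one-hidden-layer ReLU coordinate network, once plainly and once with a BN-style normalization of the hidden activations (batch statistics treated as constants, i.e. evaluation-mode BN), then prints eigenvalue statistics. All names and the network architecture here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: empirical NTK of a one-hidden-layer ReLU coordinate
# network, with and without BN-style normalization of the hidden activations
# (normalization statistics treated as constants, i.e. evaluation-mode BN).

rng = np.random.default_rng(0)
n, m = 64, 256                                  # batch size, hidden width
x = np.linspace(-1.0, 1.0, n)[:, None]          # (n, 1) input coordinates
w = rng.normal(size=(1, m))                     # first-layer weights
b = rng.normal(size=(m,))                       # first-layer biases
v = rng.normal(size=(m,)) / np.sqrt(m)          # output-layer weights

z = x @ w + b                                   # pre-activations, (n, m)
h = np.maximum(z, 0.0)                          # ReLU activations
gate = (z > 0).astype(float)                    # ReLU derivative

def ntk(feat, scale):
    """K = J J^T for f(x) = sum_j v_j * scale_j * (feat_j(x) - const_j)."""
    dv = feat                                   # gradient w.r.t. v_j
    dw = (v * scale * gate) * x                 # gradient w.r.t. w_j
    db = v * scale * gate                       # gradient w.r.t. b_j
    J = np.concatenate([dv, dw, db], axis=1)    # (n, 3m) parameter Jacobian
    return J @ J.T

K_plain = ntk(h, np.ones(m))

mu, sigma = h.mean(axis=0), h.std(axis=0) + 1e-5
K_bn = ntk((h - mu) / sigma, 1.0 / sigma)       # BN-normalized activations

for name, K in [("plain", K_plain), ("with BN", K_bn)]:
    eig = np.linalg.eigvalsh(K)
    print(f"{name:8s} max={eig.max():.3e} mean={eig.mean():.3e} "
          f"max/mean={eig.max() / eig.mean():.1f}")
```

In typical runs the centering step removes the dominant rank-one component that the all-nonnegative ReLU features induce, so the max/mean eigenvalue ratio of the BN kernel is much smaller, matching the abstract's description of a less pathological spectrum; the exact numbers depend on the random seed and width.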
