Poster

Preconditioners for the Stochastic Training of Neural Fields

Shin-Fang Chng · Hemanth Saratchandran · Simon Lucey


Abstract:

Neural fields encode continuous multidimensional signals as neural networks, enabling diverse applications in computer vision, robotics, and geometry. While Adam is effective for stochastic optimization, it often requires long training times. To address this, we explore alternative optimization techniques to accelerate training without sacrificing accuracy. Traditional second-order methods like L-BFGS are unsuitable for stochastic settings. We propose a theoretical framework for training neural fields with curvature-aware diagonal preconditioners, demonstrating their effectiveness across tasks such as image reconstruction, shape modeling, and Neural Radiance Fields (NeRF).
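The abstract does not specify the exact form of the proposed preconditioner. As a rough illustration only, the sketch below shows a stochastic gradient step preconditioned by a diagonal curvature proxy (an exponential moving average of squared gradients, a common diagonal Gauss-Newton/Fisher-style approximation). The class name, curvature estimate, and hyperparameters are assumptions for illustration, not the authors' method.

```python
# Minimal sketch, NOT the paper's method: SGD preconditioned by a running
# diagonal curvature estimate. The EMA-of-squared-gradients curvature proxy
# and all hyperparameter defaults are illustrative assumptions.
import torch

class DiagonalPreconditionedSGD(torch.optim.Optimizer):
    def __init__(self, params, lr=1e-3, beta=0.99, eps=1e-8):
        super().__init__(params, dict(lr=lr, beta=beta, eps=eps))

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            lr, beta, eps = group["lr"], group["beta"], group["eps"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                state = self.state[p]
                if "curv" not in state:
                    state["curv"] = torch.zeros_like(p)
                # EMA of squared gradients as a diagonal curvature proxy
                state["curv"].mul_(beta).addcmul_(p.grad, p.grad, value=1 - beta)
                # Preconditioned update: scale each coordinate by the inverse
                # square root of its estimated curvature
                p.addcdiv_(p.grad, state["curv"].sqrt().add_(eps), value=-lr)
```

Such a diagonal preconditioner keeps the per-step cost linear in the number of parameters, which is what makes it viable for stochastic training where full second-order methods like L-BFGS break down.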
