From Activation to Initialization: Scaling Insights for Optimizing Neural Fields

Hemanth Saratchandran · Sameera Ramasinghe · Simon Lucey

Arch 4A-E Poster #24
award Highlight
Wed 19 Jun 10:30 a.m. PDT — noon PDT


In the realm of computer vision, Neural Fields have gained prominence as a contemporary tool harnessing neural networks for signal representation. Despite the remarkable progress in adapting these networks to solve a variety of problems, the field still lacks a comprehensive theoretical framework. This article aims to address this gap by delving into the intricate interplay between initialization and activation, providing a foundational basis for the robust optimization of Neural Fields. Our theoretical insights reveal a deep-seated connection among network initialization, architectural choices, and the optimization process, emphasizing the need for a holistic approach when designing cutting-edge Neural Fields.
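The abstract's central claim is that initialization and activation must be chosen jointly for neural fields to optimize well. As a hypothetical illustration (not the authors' method), the well-known SIREN scheme pairs a sine activation `sin(omega * Wx)` with weights drawn from `U(-sqrt(6/fan_in)/omega, sqrt(6/fan_in)/omega)`, so pre-activation statistics stay stable with depth; the sketch below, using NumPy and assumed layer widths, shows that activations neither vanish nor explode under this matched pairing:

```python
import numpy as np

# Hypothetical sketch of a matched activation/initialization pairing
# (SIREN-style), not the method proposed in this paper. The frequency
# omega and layer widths below are illustrative choices.

OMEGA = 30.0

def siren_layer_init(fan_in, fan_out, omega=OMEGA, seed=None):
    # Weights uniform in [-sqrt(6/fan_in)/omega, sqrt(6/fan_in)/omega],
    # the bound that keeps sine pre-activations well-distributed.
    rng = np.random.default_rng(seed)
    bound = np.sqrt(6.0 / fan_in) / omega
    return rng.uniform(-bound, bound, size=(fan_in, fan_out))

def forward(x, weights, omega=OMEGA):
    # Propagate coordinates through sine-activated linear layers.
    for W in weights:
        x = np.sin(omega * (x @ W))
    return x

rng = np.random.default_rng(0)
coords = rng.uniform(-1.0, 1.0, size=(1024, 2))  # 2-D input coordinates
dims = [2, 256, 256, 256]
weights = [siren_layer_init(dims[i], dims[i + 1], seed=i)
           for i in range(len(dims) - 1)]
out = forward(coords, weights)

# With activation and initialization matched, the activation scale
# remains O(1) across layers instead of collapsing toward zero or
# blowing up, which is what makes the network trainable.
print(out.std())
```

A mismatched pairing (e.g. the same sine activation with a much larger weight bound, or a standard ReLU initialization) would drive the activation statistics away from this stable regime, which is the kind of interplay the paper analyzes theoretically.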
