The image depicts an academic conference poster from Nanjing University, specifically from the School of Electronic Science and Engineering. The poster, titled "Batch Normalization Alleviates the Spectral Bias in Coordinate Networks," is authored by Zhicheng Cai, Hao Zhu, Qiu Shen, Xinran Wang, and Xun Cao. The research focuses on addressing the spectral bias in coordinate networks through batch normalization (BN).

Key sections in the poster include:

1. **Introduction:**
   - Spectral Bias of Coordinate Networks: Discusses how coordinate networks prefer low-frequency components, resulting in suboptimal representation of high-frequency components.
2. **Contributions:**
   - Proposing the use of BN to alleviate spectral bias.
   - Empirical and theoretical evidence supporting this proposal.
   - Demonstrations of state-of-the-art performance in tasks such as 2D image fitting, 3D shape representation, and neural radiance field optimization.
3. **Theoretical Analysis:**
   - Detailed mathematical formulations and analyses.
   - Histograms of NTK (Neural Tangent Kernel) eigenvalues and comparisons of image reconstructions with and without BN.
4. **Figures:**
   - Fig 1: Histograms illustrating the distribution of NTK eigenvalues.
   - Fig 2: Reconstructed images comparing different methods, showcasing clear distinctions in image quality.
5. **Corresponding Author:**
   - A contact email is provided for further inquiries: zhuhao_photo@nju.edu.cn.

The poster blends theoretical insights and practical results, demonstrating the effectiveness of batch normalization in mitigating spectral bias in coordinate networks. The visuals and mathematical content are neatly arranged, making the research accessible for attendees to comprehend and engage with.

Text transcribed from the image:

Batch Normalization Alleviates the Spectral Bias in Coordinate Networks
Zhicheng Cai, Hao Zhu*, Qiu Shen, Xinran Wang, Xun Cao
School of Electronic Science and Engineering, Nanjing University
Corresponding author: zhuhao_photo@nju.edu.cn

Introduction:
- Spectral Bias of Coordinate Networks: Coordinate networks suffer from the spectral bias, which makes the model prefer low-frequency components while learning high-frequency components slowly, resulting in limited performance of signal representation.

Contribution:
1. Propose to use batch normalization (BN) to alleviate the spectral bias.
2. Theoretically and empirically prove that the pathological distribution of NTK eigenvalues can be significantly improved by using BN (Fig 1), enabling the network to represent high-frequency components effectively (Fig 2).
3. State-of-the-art performance in various tasks, such as 2D image fitting, 3D shape representation, and 5D neural radiance field optimization.

Fig 1. Histograms of NTK Eigenvalues
Fig 2. Reconstructed Images

Theoretical Analysis:

The NTK $K = \left\langle \nabla_\theta f(x; \theta), \nabla_\theta f(x'; \theta) \right\rangle$ simulates the training dynamics of the network $f(x; \theta)$. The network output at training iteration $t$ can be approximated by its linearization around the initialization $\theta_0$:
$$f(x; \theta_t) \approx f(x; \theta_0) + \nabla_\theta f(x; \theta_0)^{\top} (\theta_t - \theta_0).$$

The training error of different frequencies is dominated by the eigenvalues of the NTK. With the eigendecomposition $K = Q \Lambda Q^{\top}$, $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$, the training residual evolves as
$$Q^{\top}\big(f(X; \theta_t) - Y\big) \approx -e^{-\eta \Lambda t}\, Q^{\top} Y,$$
so larger NTK eigenvalues make the corresponding frequency components converge faster. The explicit numerical values of the NTK matrix are calculated with Mean Field Theory.

Statistical characteristics of the NTK eigenvalue distribution: the eigenvalues of the NTK exhibit a pathological distribution in which the maximum eigenvalue is extremely large while most eigenvalues are close to zero, leading to the spectral bias.
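The pathological spectrum described above can be probed numerically. Below is a minimal sketch (not the authors' implementation) that estimates the empirical NTK of a small coordinate MLP from per-sample parameter gradients and reports summary statistics of its spectrum, optionally with BN inserted after each linear layer as studied in the poster; the width, depth, ReLU activation, and 1D coordinate grid are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): estimate the empirical NTK of a small
# coordinate MLP and inspect its eigenvalue spectrum. Width, depth, the 1D
# coordinate grid, and the ReLU activation are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)


def make_mlp(use_bn: bool, width: int = 64, depth: int = 3) -> nn.Sequential:
    """Small coordinate MLP; optionally insert BN after each linear layer."""
    layers, in_dim = [], 1
    for _ in range(depth):
        layers.append(nn.Linear(in_dim, width))
        if use_bn:
            layers.append(nn.BatchNorm1d(width))
        layers.append(nn.ReLU())
        in_dim = width
    layers.append(nn.Linear(in_dim, 1))
    return nn.Sequential(*layers)


def empirical_ntk(model: nn.Module, xs: torch.Tensor) -> torch.Tensor:
    """K[i, j] = <df(x_i)/dtheta, df(x_j)/dtheta> for a scalar-output network."""
    params = [p for p in model.parameters() if p.requires_grad]
    outs = model(xs).squeeze(-1)      # single forward pass over the whole batch,
                                      # so BN uses the batch statistics
    grads = []
    for i in range(outs.shape[0]):
        g = torch.autograd.grad(outs[i], params, retain_graph=True)
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    G = torch.stack(grads)            # (n, num_params)
    return G @ G.T                    # (n, n) NTK Gram matrix


xs = torch.linspace(-1.0, 1.0, 64).unsqueeze(1)   # 1D coordinate grid

for use_bn in (False, True):
    K = empirical_ntk(make_mlp(use_bn), xs)
    eigvals = torch.linalg.eigvalsh(K)            # eigenvalues in ascending order
    print(f"use_bn={use_bn}: max eig {eigvals.max():.3e}, "
          f"median eig {eigvals.median():.3e}")
```

Plotting `eigvals` as a histogram for the two settings would give a small-scale analogue of Fig 1; the snippet only prints summary statistics to keep the sketch short.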
Add Batch Normalization to the neural network: each layer's pre-activations $z_i$ are normalized over the batch of input coordinates,
$$\mathrm{BN}(z_i) = \gamma\, \frac{z_i - \mu_{\mathcal{B}}}{\sqrt{\sigma_{\mathcal{B}}^{2} + \epsilon}} + \beta, \qquad \mu_{\mathcal{B}} = \frac{1}{m} \sum_{i=1}^{m} z_i, \qquad \sigma_{\mathcal{B}}^{2} = \frac{1}{m} \sum_{i=1}^{m} (z_i - \mu_{\mathcal{B}})^{2},$$
where $\gamma$ and $\beta$ are learnable parameters. The numerical values of the normalized NTK matrix $K_{\mathrm{BN}}$ are then calculated with Mean Field Theory.

Statistical characteristics of the BN-NTK eigenvalue distribution: batch normalization alleviates the spectral bias by shifting the NTK's eigenvalue distribution from lower values toward higher ones, facilitating the convergence of high-frequency components and improving the representational performance.
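To connect this to the representation tasks listed under Contributions, here is a second minimal sketch (again an illustrative assumption, not the authors' training setup): it fits a high-frequency 1D signal with a coordinate MLP, with and without BN between each linear layer and its activation, and compares the final fitting error.

```python
# Minimal sketch (not the authors' training setup): fit a high-frequency 1D
# signal with a coordinate MLP, with and without BN. The target signal,
# architecture, optimizer, learning rate, and step count are all illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)


def make_mlp(use_bn: bool, width: int = 128, depth: int = 4) -> nn.Sequential:
    layers, in_dim = [], 1
    for _ in range(depth):
        layers.append(nn.Linear(in_dim, width))
        if use_bn:
            layers.append(nn.BatchNorm1d(width))   # BN inserted before the activation
        layers.append(nn.ReLU())
        in_dim = width
    layers.append(nn.Linear(in_dim, 1))
    return nn.Sequential(*layers)


xs = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
ys = torch.sin(20.0 * torch.pi * xs)               # high-frequency target signal

for use_bn in (False, True):
    model = make_mlp(use_bn)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(2000):                          # full-batch MSE fitting
        opt.zero_grad()
        loss = ((model(xs) - ys) ** 2).mean()
        loss.backward()
        opt.step()
    print(f"use_bn={use_bn}: final MSE = {loss.item():.4e}")
```

Plotting `model(xs)` against `ys` for the two settings gives a 1D analogue of the reconstruction comparison in Fig 2.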