**Caption:** A detailed scientific poster from Nanjing University titled "Batch Normalization Alleviates the Spectral Bias in Coordinate Networks" by Zhicheng Cai, Hao Zhu, Qiu Shen, Xinran Wang, and Xun Cao is on display at a conference. The poster outlines their theoretical analysis and contributions in addressing the spectral bias affecting coordinate networks. Key highlights of the poster include:

1. **Introduction & Contribution**: The spectral bias in coordinate networks leads to slower learning of high-frequency components, limiting signal representation. The team proposes the use of batch normalization (BN) to alleviate this issue.
2. **Theoretical Analysis**: Detailed mathematical formulations describe how the pathological distribution of NTK (Neural Tangent Kernel) eigenvalues leads to the spectral bias. The poster further explains how adding BN modifies these eigenvalues and alleviates this pathological distribution.
3. **Empirical Results**: Histograms of NTK eigenvalues and reconstructed images demonstrate the effectiveness of BN in reducing spectral bias. Various graphs and sample images support these claims.
4. **Application of State-of-the-Art Techniques**: The research incorporates cutting-edge methods in 3D shape representation and neural radiance field optimization.

The backdrop shows the conference environment, with numerous researchers and academics engaged in discussions.

Text transcribed from the image:

南京大学 NANJING UNIVERSITY

Batch Normalization Alleviates the Spectral Bias in Coordinate Networks
Zhicheng Cai, Hao Zhu*, Qiu Shen, Xinran Wang, Xun Cao
School of Electronic Science and Engineering, Nanjing University
Corresponding author: zhuhao_photo@nju.edu.cn

Introduction:

➤ Spectral Bias of Coordinate Networks
Coordinate networks suffer from the spectral bias, which makes the model prefer low-frequency components while learning high-frequency components slowly, resulting in limited performance of signal representation.

➤ Contribution:
1. Propose to use batch normalization (BN) to alleviate the spectral bias.
2. Theoretically and empirically prove that the pathological distribution of NTK eigenvalues can be significantly improved by using BN (Fig 1), enabling high-frequency components to be represented effectively (Fig 2).
3. State-of-the-art performance in various tasks, such as 2D image fitting, 3D shape representation, and 5D neural radiance field optimization.

[Fig 1. Histograms of NTK Eigenvalues: panels for MLP, PEMLP-1, and PEMLP-5, each without BN and with BN; x-axes show eigenvalues on a log scale.]

Theoretical Analysis:

➤ The NTK $K = (\nabla_\theta f(X;\theta))^T \nabla_\theta f(X;\theta)$ simulates the training dynamics of the network $f(X;\theta)$. The network output after $t$ training iterations can be approximated as
$$Y(t) \approx (I - e^{-Kt})\,Y.$$
The training error of different frequencies is determined by the eigenvalues of the NTK:
$$K = Q\Lambda Q^T, \qquad e^{-Kt} = Q e^{-\Lambda t} Q^T, \qquad Q^T\big(Y(t) - Y\big) \approx -e^{-\Lambda t} Q^T Y.$$
I: Larger NTK eigenvalues make the corresponding frequency components converge faster.

➤ Calculate the explicit numerical values of the NTK matrix with Mean Field Theory:
$$K = (L-1)N \begin{bmatrix} \kappa_1 & \kappa_2 & \cdots & \kappa_2 \\ \kappa_2 & \kappa_1 & \cdots & \kappa_2 \\ \vdots & \vdots & \ddots & \vdots \\ \kappa_2 & \kappa_2 & \cdots & \kappa_1 \end{bmatrix} + O(\sqrt{N}).$$
Statistical characteristics of the NTK eigenvalue distribution:
$$m_\lambda \sim O(N), \qquad v_\lambda \sim O(N^3), \qquad \lambda_{\max} \sim O(N^2).$$
II: The eigenvalues of the NTK exhibit a pathological distribution: the maximum eigenvalue is extremely large while most eigenvalues are close to zero, leading to the spectral bias.
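Statement I can be illustrated with a short numerical sketch. The snippet below is not from the poster; the toy spectrum (one very large eigenvalue and many near-zero ones) is an assumption chosen to mimic the pathological distribution described in II. It evaluates the per-component residual $e^{-\lambda_i t}$ implied by the linearized dynamics $Q^T(Y(t)-Y) \approx -e^{-\Lambda t}Q^T Y$.

```python
import numpy as np

# Toy NTK spectrum: one dominant eigenvalue and many near-zero ones,
# mimicking the pathological distribution described in statement II.
eigvals = np.concatenate(([1e3], np.full(63, 1e-3)))

# Residual of each eigencomponent after "time" t of linearized training:
# |Q^T (Y(t) - Y)|_i decays proportionally to exp(-lambda_i * t).
for t in (0.01, 1.0, 100.0):
    residual = np.exp(-eigvals * t)
    print(f"t={t:g}  largest-eig residual: {residual[0]:.2e}  "
          f"smallest-eig residual: {residual[-1]:.2e}")
```

The component aligned with the large eigenvalue is essentially converged by $t = 0.01$, while the components tied to near-zero eigenvalues have barely moved even at $t = 100$, which is exactly the slow learning of high-frequency content described above.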
➤ Add Batch Normalization to the neural network:
$$f_{BN}(X;\theta) = \frac{H - \mu}{\sigma} + \beta, \qquad \mu = E[H], \qquad \sigma = \sqrt{E[H^2] - (E[H])^2},$$
$$K_{BN} = (\nabla_\theta f_{BN}(X;\theta))^T \nabla_\theta f_{BN}(X;\theta).$$

➤ Calculate the numerical values of the normalized NTK matrix with Mean Field Theory: $K_{BN}$ again takes the form of $(L-1)N$ times a structured matrix whose entries depend on $\kappa_1$, $\kappa_2$, and $L$, up to an $O(\sqrt{N})$ correction. Statistical characteristics of the BN-NTK eigenvalue distribution:
$$m_\lambda \sim O(N), \qquad v_\lambda \sim O(N), \qquad O(N) < \lambda_{\max} \le O(N^{3/2}).$$
III: Batch normalization alleviates the spectral bias by shifting the distribution of the NTK's eigenvalues from lower values to higher ones, facilitating the convergence of high-frequency components and improving the representational performance.

[Fig 2. Reconstructed Images, with PSNR values: without BN — 23.25 dB, 27.18 dB, 28.79 dB; with BN — MLP 30.05 dB, PEMLP-1 30.84 dB, PEMLP-5 31.31 dB.]

Experimental Results:
[Panels showing frequency-specific results, 2D image fitting, 3D shape representation, and 5D NeRF optimization, comparing against GT, ReLU, SIREN, and Plenoxel baselines; quantitative results reported in PSNR (dB).]
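The eigenvalue statistics of the NTK with and without BN (statements II and III) can also be checked empirically. Below is a minimal PyTorch sketch, not the authors' code: the network width and depth, the batch of random 2D coordinates, and the plain ReLU architecture are illustrative assumptions. It builds a small coordinate MLP with and without BN, computes the empirical NTK $K = JJ^T$ from the Jacobian of the batch outputs with respect to all parameters, and prints the eigenvalue mean, variance, and maximum, i.e. the $m_\lambda$, $v_\lambda$, and $\lambda_{\max}$ quantities analyzed above.

```python
import torch
import torch.nn as nn

def make_mlp(with_bn: bool, width: int = 64, depth: int = 3) -> nn.Sequential:
    """ReLU coordinate network mapping 2D coordinates to a scalar, optionally with BN."""
    layers, d_in = [], 2
    for _ in range(depth):
        layers.append(nn.Linear(d_in, width))
        if with_bn:
            layers.append(nn.BatchNorm1d(width))
        layers.append(nn.ReLU())
        d_in = width
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)

def empirical_ntk(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """K = J J^T, where J is the Jacobian of the batch outputs w.r.t. all parameters."""
    out = model(x).squeeze(-1)                       # shape (B,)
    rows = []
    for i in range(out.shape[0]):
        grads = torch.autograd.grad(out[i], list(model.parameters()),
                                    retain_graph=True)
        rows.append(torch.cat([g.reshape(-1) for g in grads]))
    jac = torch.stack(rows)                          # (B, num_params)
    return jac @ jac.T

x = torch.rand(128, 2)                               # random 2D coordinates
for with_bn in (False, True):
    torch.manual_seed(0)
    model = make_mlp(with_bn).train()                # BN normalizes with batch statistics
    lam = torch.linalg.eigvalsh(empirical_ntk(model, x))
    print(f"BN={with_bn}:  mean={lam.mean().item():.3e}  "
          f"var={lam.var().item():.3e}  max={lam.max().item():.3e}")
```

Comparing the max-to-mean ratio of the two printed spectra gives a quick sanity check of the pathological-distribution claim; plotting histograms of `lam` on a log axis would reproduce the style of Fig 1.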