南京大学 NANJING UNIVERSITY

Batch Normalization Alleviates the Spectral Bias in Coordinate Networks

Introduction:
➤ Spectral Bias of Coordinate Networks
Coordinate networks suffer from the spectral bias: the model prefers low-frequency components and learns high-frequency components slowly, resulting in limited performance of signal representation.
➤ Contribution:
1. Propose to use batch normalization (BN) to alleviate the spectral bias.
2. Theoretically and empirically prove that the pathological distribution of NTK eigenvalues can be significantly improved by using BN (Fig 1), enabling high-frequency components to be represented effectively (Fig 2).
3. State-of-the-art performance in various tasks, such as 2D image fitting, 3D shape representation, and 5D neural radiance field optimization.

[Fig 1. Histograms of NTK eigenvalues (x-axis: eigenvalues) for MLP, PEMLP-1, and PEMLP-5, each without and with BN.]
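As a quick illustration of what a "pathological" eigenvalue distribution looks like and how normalization flattens it, here is a minimal NumPy sketch. The setup is hypothetical (a random ReLU feature map, not the paper's coordinate-network experiments): the Gram matrix of raw features has one dominant eigenvalue while most are near zero, and standardizing each feature channel over the batch, as BN does, reduces that dominance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (random ReLU features, not the paper's networks):
# compare the eigenvalue spread of a feature Gram matrix before and after
# per-channel standardization, which is what BN applies at training time.
T, N = 128, 512                            # batch size, hidden width
X = rng.uniform(-1.0, 1.0, size=(T, 1))    # 1D input coordinates
W = rng.normal(size=(1, N))
b = rng.normal(size=N)
H = np.maximum(X @ W + b, 0.0)             # hidden activations, shape (T, N)

lam = np.linalg.eigvalsh(H @ H.T)          # Gram-matrix spectrum, raw features

# Per-channel standardization (zero mean, unit variance over the batch).
H_bn = (H - H.mean(axis=0)) / (H.std(axis=0) + 1e-8)
lam_bn = np.linalg.eigvalsh(H_bn @ H_bn.T)

# Max/median ratio as a crude measure of how pathological the spectrum is.
ratio = lam.max() / np.median(np.abs(lam))
ratio_bn = lam_bn.max() / np.median(np.abs(lam_bn))
print(f"raw features: {ratio:.1e}, standardized: {ratio_bn:.1e}")
```

With the raw features, the positive mean of the ReLU activations creates one huge eigenvalue; centering removes that direction and spreads the spectrum, which is the qualitative effect Fig 1 shows for the actual networks.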
Zhicheng Cai, Hao Zhu*, Qiu Shen, Xinran Wang, Xun Cao
School of Electronic Science and Engineering, Nanjing University
*Corresponding author: zhuhao_photo@nju.edu.cn

Theoretical Analysis:
➤ The NTK K = (∇_θ f(X;θ))^T ∇_θ f(X;θ) simulates the training dynamics of the network f(X;θ). The network output after t training iterations can be approximated as
  Y(t) ≈ (I − e^{−Kt}) Y.
The training error at different frequencies is determined by the eigenvalues of the NTK. With the eigendecomposition K = QΛQ^T:
  e^{−Kt} = Q e^{−Λt} Q^T,  Q^T (Y(t) − Y) ≈ −e^{−Λt} Q^T Y.
I: Larger NTK eigenvalues make the corresponding frequency components converge faster.

➤ Calculate the explicit numerical values of the NTK matrix with Mean Field Theory:
  K = (L−1)N · M + O(√N),
where M has κ₁ on the diagonal and κ₂ in every off-diagonal entry, with κ₁, κ₂ constants determined by the L−1 hidden layers.
Statistical characteristics of the NTK eigenvalue distribution:
  m_λ ~ O(N),  v_λ ~ O(N³),  λ_max ~ O(N²).
II: The eigenvalues of the NTK exhibit a pathological distribution: the maximum eigenvalue is extremely large while most eigenvalues are close to zero, leading to the spectral bias.

➤ Add batch normalization to the neural network:
  f_BN(X;θ) = (H − μ)/σ + β,  μ = E[H],  σ = √(E[H²] − (E[H])²),
  K_BN = (∇_θ f_BN(X;θ))^T ∇_θ f_BN(X;θ) ≈ (∇_θH)^T ∇_θH / σ² − H H^T (∇_θH)^T ∇_θH / (Tσ⁴).
Calculate the numerical values of the normalized NTK matrix with Mean Field Theory:
  K_BN = (L−1)N · M_BN + O(√N),
where M_BN has diagonal entries proportional to (κ₁ − κ₂)(T − 1)/T and off-diagonal entries proportional to (κ₂ − κ₁)/T, with T the batch size.
Statistical characteristics of the BN-NTK eigenvalue distribution:
  m_λ ~ O(N),  v_λ ~ O(N),  O(N) ≤ λ_max ≤ O(N^{3/2}).
III: Batch normalization alleviates the spectral bias by shifting the NTK eigenvalue distribution from lower values to higher ones, facilitating the convergence of high-frequency components and improving representational performance.

Experimental Results:
➤ Frequency-specific analysis across 2D image fitting, 3D shape representation, and 5D NeRF optimization, compared against GT and the ReLU, SIREN, and Plenoxel baselines.

[Fig 2. Reconstructed images for MLP, PEMLP-1, and PEMLP-5 without and with BN; per-panel PSNR values range from 23.25 dB to 31.31 dB.]
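The training-dynamics relation Y(t) ≈ (I − e^{−Kt})Y and its per-mode form can be checked numerically. This is a minimal sketch with a synthetic positive-definite matrix standing in for the NTK (all values illustrative, not from the paper): the relative residual of each eigenmode decays as e^{−λt}, so modes with larger eigenvalues converge faster, which is Observation I.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 6
A = rng.normal(size=(T, T))
K = A @ A.T + np.eye(T)          # synthetic positive-definite stand-in for the NTK
lam, Q = np.linalg.eigh(K)       # eigenvalues ascending; K = Q diag(lam) Q^T

Y = rng.normal(size=T)           # target signal
t = 0.05                         # "training time"

# Output after time t: Y(t) = (I - e^{-Kt}) Y, via the eigendecomposition.
Yt = Q @ ((1.0 - np.exp(-lam * t)) * (Q.T @ Y))

# Per-mode residual matches  Q^T (Y(t) - Y) = -e^{-Lambda t} Q^T Y.
residual = np.abs(Q.T @ (Yt - Y))
expected = np.exp(-lam * t) * np.abs(Q.T @ Y)
print(np.allclose(residual, expected))   # True: the identity holds

# Relative residual per mode equals e^{-lam * t}: since lam is sorted
# ascending, the decay is monotonically faster for larger eigenvalues.
rel = residual / np.abs(Q.T @ Y)
print(rel)
```

Plugging in the paper's scalings makes the point concrete: a mode with eigenvalue O(N²) is essentially converged when a near-zero mode has barely moved, which is exactly the spectral bias described above.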