A detailed research poster from Nanjing University's School of Electronic Science and Engineering, presented by Zhicheng Cai and colleagues, addresses the issue of spectral bias in coordinate networks. Titled "Batch Normalization Alleviates the Spectral Bias in Coordinate Networks," the poster opens with an introduction to spectral bias and its impact on the performance of coordinate networks. The key contribution is the proposal to use batch normalization (BN) to alleviate this bias. The theoretical analysis examines the Neural Tangent Kernel (NTK) and the role of BN in reshaping its eigenvalue distribution. Diagrams and graphs illustrate the results, including histograms of NTK eigenvalues and reconstructed images showcasing the improvements obtained with BN; side-by-side comparisons of reconstructions with and without BN demonstrate clear visual gains. The poster also presents equations and statistical analyses underscoring the benefits of BN for representational performance in tasks such as 2D image fitting and 3D shape representation, and an experimental-results section covering frequency-specific reconstructions. The layout is methodically organized, making the concepts easy to follow, and the poster is displayed at a professional conference, suggesting active engagement with the academic and technical community.

Text transcribed from the image:

南京大学 (Nanjing University)
Batch Normalization Alleviates the Spectral Bias in Coordinate Networks
Zhicheng Cai, Hao Zhu*, Qiu Shen, Xinran Wang, Xun Cao
School of Electronic Science and Engineering, Nanjing University
*Corresponding author: zhuhao_photo@nju.edu.cn

Introduction:
* Spectral Bias of Coordinate Networks
Coordinate networks suffer from the spectral bias, which makes the model prefer low-frequency components while learning high-frequency components slowly, resulting in limited signal-representation performance.
* Contribution:
1. Propose to use batch normalization (BN) to alleviate the spectral bias.
2. Theoretically and empirically prove that the pathological distribution of NTK eigenvalues can be significantly improved by using BN (Fig. 1), enabling the network to represent high-frequency components.
3. State-of-the-art performance in various tasks, such as 2D image fitting, 3D shape representation, and 3D neural radiance field optimization.

Fig 1. Histograms of NTK Eigenvalues

Theoretical Analysis:
* NTK: K(x, x') = ⟨∇_θ f(x; θ), ∇_θ f(x'; θ)⟩ simulates the training dynamics of the network f(x; θ).
* With the eigendecomposition K = QΛQ^T, the network output after t training iterations can be approximated as y_t ≈ (I − Q diag(e^{−η λ_j t}) Q^T) y, i.e., the residual in the NTK eigenbasis decays as Q^T (y − y_t) ≈ diag(e^{−η λ_j t}) Q^T y. The training error of each frequency component is therefore determined by the corresponding NTK eigenvalue, and the BN-NTK can be written as K^BN = Φ^T H Φ (see the numerical sketch below).
I: Larger NTK eigenvalues: the corresponding frequency components converge faster.
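To make the NTK training-dynamics argument above concrete, here is a minimal PyTorch sketch (not part of the poster) that computes the empirical NTK of a toy ReLU coordinate MLP, eigendecomposes it, and evaluates the per-eigendirection residual factor e^{−η λ_j t}. The network width and depth, the target signal sin(8πx), and the learning rate η and step count t are illustrative assumptions, not the poster's setup.

```python
import torch
import torch.nn as nn

def flat_grad(scalar_out, params):
    """Gradient of a single scalar output w.r.t. all parameters, flattened into one vector."""
    grads = torch.autograd.grad(scalar_out, params, retain_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

torch.manual_seed(0)
# Toy coordinate MLP (width/depth are illustrative choices).
net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
params = [p for p in net.parameters() if p.requires_grad]

x = torch.linspace(-1, 1, 64).unsqueeze(1)   # N = 64 input coordinates
y = torch.sin(8 * torch.pi * x)              # a high-frequency target signal (assumption)

out = net(x)                                                               # shape (N, 1)
J = torch.stack([flat_grad(out[i, 0], params) for i in range(x.shape[0])]) # (N, P) Jacobian
K = J @ J.T                                  # empirical NTK: K_ij = <∇_θ f(x_i), ∇_θ f(x_j)>

lam, Q = torch.linalg.eigh(K)                # K = Q diag(λ) Q^T, eigenvalues in ascending order
eta, t = 1e-3, 1000                          # illustrative learning rate and iteration count
decay = torch.exp(-eta * lam * t)            # residual factor e^{-η λ_j t} per eigen-direction
residual0 = Q.T @ (y - out.detach())         # initial residual projected into the NTK eigenbasis

print(f"λ_max = {lam[-1].item():.3e}, λ_min = {lam[0].item():.3e}")
# Directions with large eigenvalues shed their residual quickly; directions with tiny
# eigenvalues barely move, which is the spectral bias described in the poster.
print("remaining residual, top vs bottom eigen-direction:",
      (decay[-1] * residual0[-1].abs()).item(),
      (decay[0] * residual0[0].abs()).item())
```

The eigendecomposition step mirrors the poster's claim I directly: the larger λ_j is, the smaller e^{−η λ_j t} becomes, so that frequency component of the target converges faster.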
Calculate the explicit values of the NTK matrix with mean-field theory:
K = L σ_w² E[ξ φ(ξ)] + σ_b²,  H = σ_w² E[φ(ξ) φ(ξ)] + σ_b²
Statistical characteristics of the NTK eigenvalue distribution: λ_max = O(N), while the remaining eigenvalues are of much lower order, so the spread of the spectrum grows with the number of input coordinates N.
II: The eigenvalues of the NTK exhibit a pathological distribution; their variance is proportional to the input data dimension, leading to a bias towards learning low-frequency components.

Add batch normalization (a comparison sketch follows the transcription):
h_x^BN = γ (h_x − μ_B) / √(Var_B(h_x) + ε) + β,  with μ_B = (1/N) Σ_{i=1}^{N} h_{x_i} computed over the batch of coordinates.
Calculate the explicit values of the normalized NTK matrix K^BN with mean-field theory, yielding a closed-form expression in σ_w², σ_b², and expectations over the normalized pre-activations.
Statistical characteristics of the BN-NTK eigenvalue distribution: the bulk of the eigenvalues shifts to larger values and λ_min^BN stays within a constant factor of λ_max^BN (λ_min^BN ≈ λ_max^BN / s), so the pathological spread disappears.
III: Batch normalization alleviates the spectral bias by shifting the NTK eigenvalue distribution from a lower one to a higher one and reducing λ_max / λ_min, facilitating the convergence of high-frequency components and improving representational performance.

Fig 2. Reconstructed Images

Experimental Results:
* Frequency-specific Fitting
* 2D Image Fitting
* 3D Shape Representation
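The following sketch (again not from the poster) is one way to probe the BN-NTK claim empirically: it builds the same kind of toy ReLU coordinate MLP with and without BatchNorm1d layers, computes the empirical NTK at random initialization, and compares the eigenvalue spread λ_max / λ_min. The architecture, coordinate grid, and seed are assumptions; the exact numbers will vary, but the comparison is meant to echo the qualitative shift shown in the poster's Fig. 1.

```python
import torch
import torch.nn as nn

def empirical_ntk(net, x):
    """K_ij = <∇_θ f(x_i), ∇_θ f(x_j)> for a scalar-output network, via per-sample gradients."""
    params = [p for p in net.parameters() if p.requires_grad]
    out = net(x)
    rows = []
    for i in range(x.shape[0]):
        grads = torch.autograd.grad(out[i, 0], params, retain_graph=True)
        rows.append(torch.cat([g.reshape(-1) for g in grads]))
    return torch.stack(rows) @ torch.stack(rows).T if False else (lambda J: J @ J.T)(torch.stack(rows))

def coord_mlp(use_bn, width=64, depth=3):
    """Plain ReLU coordinate MLP; optionally insert BatchNorm1d after each hidden linear layer."""
    layers, in_dim = [], 1
    for _ in range(depth):
        layers.append(nn.Linear(in_dim, width))
        if use_bn:
            layers.append(nn.BatchNorm1d(width))
        layers.append(nn.ReLU())
        in_dim = width
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)

torch.manual_seed(0)
x = torch.linspace(-1, 1, 128).unsqueeze(1)   # a batch of N = 128 coordinates

for use_bn in (False, True):
    net = coord_mlp(use_bn)
    net.train()                               # keep BN in batch-statistics mode, matching the analysis
    lam = torch.linalg.eigvalsh(empirical_ntk(net, x)).clamp_min(1e-12)  # guard tiny negatives
    print(f"BN={use_bn}:  λ_max={lam[-1].item():.3e}  λ_min={lam[0].item():.3e}  "
          f"λ_max/λ_min={(lam[-1] / lam[0]).item():.3e}")
```

A smaller λ_max/λ_min ratio with BN would correspond to the poster's conclusion III: a more balanced eigenvalue distribution lets high-frequency components converge at rates closer to the low-frequency ones.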