From Frequency Bias to Spectral Balance: Operator-Aware Preconditioners for PINNs
math.NA
/ Authors
/ Abstract
When neural networks (NNs) are used as nonlinear parametric representations to solve partial differential equations (PDEs), they often display frequency-dependent learning dynamics that differ from those seen in direct function-approximation tasks. These dynamics result from a competition between the frequency bias of the NN representation and that of the underlying differential operator. Although many commonly used NNs are biased towards low-frequency modes in representation, the differential operator in the loss function amplifies high-frequency components and can produce an overall high-frequency bias. In this work, using second-order elliptic PDEs as an example, we show how these two factors compete and determine the overall frequency bias in different regimes. Once the balance is understood, computational strategies are needed to counter the resulting bias and improve training efficiency. We propose a simple operator-aware preconditioning strategy that rebalances the optimization landscape and the learning dynamics by applying an auxiliary integral operator to the residual. The integral kernel can be the Green's function of a reference elliptic operator, or an approximation of it, and the strategy integrates easily with common NN solvers for PDEs. Extensive experiments, including multiscale and variable-coefficient problems, show that the approach restores more balanced learning dynamics across modes and substantially improves both convergence and accuracy.
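The core idea in the abstract, applying an integral operator with a Green's-function kernel to the PDE residual before forming the loss, can be illustrated with a minimal NumPy sketch. This is an assumed toy setup, not the paper's implementation: the reference operator is -u'' on [0,1] with zero Dirichlet boundary conditions, whose Green's function is G(x, y) = min(x, y)(1 - max(x, y)), and the integral is approximated by a quadrature sum on a uniform grid. All names (`preconditioned_loss`, the grid size `n`) are illustrative.

```python
import numpy as np

# Hypothetical sketch of Green's-function preconditioning of a PDE residual.
# Reference operator: -d^2/dx^2 on [0, 1] with u(0) = u(1) = 0, whose
# Green's function is G(x, y) = min(x, y) * (1 - max(x, y)).

n = 200
x = np.linspace(0.0, 1.0, n + 2)[1:-1]   # interior quadrature points
h = x[1] - x[0]                          # uniform grid spacing

# Dense discretization of the integral kernel on the grid.
X, Y = np.meshgrid(x, x, indexing="ij")
G = np.minimum(X, Y) * (1.0 - np.maximum(X, Y))

def preconditioned_loss(residual):
    """Loss on the smoothed residual (G r)(x_i) ~ sum_j G(x_i, y_j) r(y_j) h
    instead of on r itself. The smoothing kernel damps the high-frequency
    amplification introduced by the differential operator."""
    Gr = G @ residual * h
    return np.mean(Gr ** 2)

# Sanity check of the kernel: for -u'' = f with f = pi^2 sin(pi x), the
# exact solution is u = sin(pi x), recovered here by applying G to f.
f = np.pi ** 2 * np.sin(np.pi * x)
u_from_green = G @ f * h
```

In an actual PINN training loop, `residual` would be the network's PDE residual evaluated at collocation points, so the preconditioner is just one extra matrix-vector product per loss evaluation; the kernel could equally be replaced by an approximate or learned surrogate when no closed-form Green's function is available.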