No. 2025.07
Non-Asymptotic Analysis of Projected Gradient Descent for Physics-Informed Neural Networks
J. Nießen and J. Müller
Subject: Physics-informed neural networks, neural tangent kernel, reproducing kernel Hilbert space

Abstract

In this work, we provide a non-asymptotic convergence analysis of projected gradient descent for physics-informed neural networks applied to the Poisson equation. Under suitable assumptions, we show that the optimization error can be bounded by $\mathcal{O}(1/\sqrt{T} + 1/\sqrt{m} + \epsilon_{\textup{approx}})$, where $T$ is the number of algorithm time steps, $m$ is the width of the neural network, and $\epsilon_{\textup{approx}}$ is an approximation error. The proof of our optimization result relies on bounding the linearization error of the network and combining this bound with a Lyapunov drift analysis. Additionally, we quantify the generalization error by bounding the Rademacher complexities of the neural network and its Laplacian. Combining the optimization and generalization results, we obtain an overall error estimate based on an existing error estimate from regularity theory.
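To make the setting concrete, the following is a minimal, illustrative sketch (not the paper's actual construction) of projected gradient descent for a physics-informed loss on the 1D Poisson problem $-u'' = f$ on $(0,1)$ with zero boundary conditions. The inner weights of a shallow tanh network are frozen at random values, loosely mimicking the lazy-training/linearized regime the analysis works in, and only the outer layer is trained; after each gradient step the parameters are projected onto a Euclidean ball of radius `R`. All names, the radius, and the step size are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shallow network u(x) = sum_j a_j * tanh(w_j * x + b_j).
# Inner weights (w, b) are frozen at random draws (a stand-in for the
# linearized/NTK-style regime); only the outer layer a is trained.
m = 200                      # network width
w = rng.normal(size=m)
b = rng.normal(size=m)
a = np.zeros(m)

# Poisson problem -u'' = f on (0,1), u(0) = u(1) = 0,
# with f chosen so that the true solution is u*(x) = sin(pi x).
N = 64
x = np.linspace(0.0, 1.0, N)
f = np.pi**2 * np.sin(np.pi * x)

def tanh_dd(z):
    """Second derivative of tanh."""
    t = np.tanh(z)
    return -2.0 * t * (1.0 - t**2)

# With frozen inner weights the model is linear in a:
# the PDE residual is Psi @ a - f and the boundary values are Phi0 @ a, Phi1 @ a.
Psi = -(w**2) * tanh_dd(np.outer(x, w) + b)   # (N, m): contribution of -u''
Phi0 = np.tanh(0.0 * w + b)                   # features of u(0)
Phi1 = np.tanh(1.0 * w + b)                   # features of u(1)

lam = 1.0    # boundary penalty weight (illustrative choice)
R = 10.0     # radius of the ball we project onto (illustrative constraint set)
lr = 1e-3    # step size

def loss(a):
    r = Psi @ a - f
    return (r @ r) / N + lam * ((Phi0 @ a)**2 + (Phi1 @ a)**2)

for _ in range(3000):
    r = Psi @ a - f
    grad = (2.0 / N) * Psi.T @ r \
         + 2.0 * lam * (Phi0 * (Phi0 @ a) + Phi1 * (Phi1 @ a))
    a = a - lr * grad
    # Projection step: pull a back onto the Euclidean ball of radius R.
    norm = np.linalg.norm(a)
    if norm > R:
        a *= R / norm

print(f"physics-informed loss after training: {loss(a):.4f}")
```

Because the model is linear in the trained parameters, the loss is a convex quadratic and each projected step cannot leave the constraint ball, which is the structure the $\mathcal{O}(1/\sqrt{T})$ drift argument exploits; the $1/\sqrt{m}$ term accounts for the linearization error that this frozen-feature sketch ignores.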

Download

arXiv:2505.07311