Liu K, Zuazua Iriondo E (2025)
Publication Language: English
Publication Status: Accepted
Publication Type: Unpublished / Preprint
Future Publication Type: Journal article
Publication year: 2025
Publisher: Math. Models Methods Appl. Sci.
Open Access Link: https://arxiv.org/html/2412.01619v1
In this work, we address three non-convex optimization problems associated with the training of shallow neural networks (NNs) for exact and approximate representation, as well as for regression tasks. Through a mean-field approach, we convexify these problems and, by applying a representer theorem, prove the absence of relaxation gaps. We establish generalization bounds for the resulting NN solutions, assessing their predictive performance on test datasets, and, by analyzing the impact of key hyperparameters on these bounds, we propose optimal choices.
On the computational side, we examine the discretization of the convexified problems and derive convergence rates. For low-dimensional datasets, these discretized problems are efficiently solvable using the simplex method. For high-dimensional datasets, we propose a sparsification algorithm that, combined with gradient descent for over-parameterized shallow NNs, yields effective solutions to the primal problems.
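As a rough illustration of the low-dimensional route described above, the sketch below discretizes the hidden-neuron parameter space of a shallow ReLU network on a random grid and solves the resulting exact-representation problem as a linear program with a dual-simplex solver. All specifics (the toy 1-D dataset, the grid size, the l1 objective with the usual positive/negative split) are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy 1-D dataset: N points to be represented exactly.
x = np.linspace(-1.0, 1.0, 5)
y = np.sin(np.pi * x)

# Discretize the (w, b) parameter space of hidden neurons on a random grid.
M = 50
w = rng.uniform(-2.0, 2.0, M)
b = rng.uniform(-2.0, 2.0, M)
Phi = np.maximum(w[None, :] * x[:, None] + b[None, :], 0.0)  # ReLU features, shape (N, M)

# Minimize ||a||_1 subject to Phi @ a = y, via the split a = a_plus - a_minus,
# which turns the problem into a standard-form LP.
c = np.ones(2 * M)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs-ds")
a = res.x[:M] - res.x[M:]
print("LP status:", res.status, "max residual:", np.max(np.abs(Phi @ a - y)))
```

Because a basic optimal solution of this LP has at most as many nonzero coefficients as data points, the simplex method returns a sparse network automatically; in high dimensions, where such grids become intractable, the abstract instead proposes sparsification combined with gradient descent on over-parameterized networks.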
APA:
Liu, K., & Zuazua Iriondo, E. (2025). Representation and Regression Problems in Neural Networks: Relaxation, Generalization, and Numerics. (Unpublished, Accepted).
MLA:
Liu, Kang, and Enrique Zuazua Iriondo. Representation and Regression Problems in Neural Networks: Relaxation, Generalization, and Numerics. Unpublished, Accepted. 2025.