Song Y, Yuan X, Yue H (2024). Accelerated primal-dual methods with enlarged step sizes and operator learning for nonsmooth optimal control problems.
Publication Language: English
Publication Status: Submitted
Publication Type: Unpublished / Preprint
Future Publication Type: Journal article
Publication year: 2024
URI: https://arxiv.org/abs/2307.00296
Open Access Link: https://arxiv.org/abs/2307.00296
Abstract: We consider a general class of nonsmooth optimal control problems with partial differential equation (PDE) constraints, which are challenging due to the nonsmooth objective functionals and the high-dimensional, ill-conditioned systems that arise after discretization. We focus on a primal-dual method, in which the different types of variables are treated individually, so that the main computation at each iteration reduces to solving two PDEs. Our goal is to accelerate the primal-dual method with either larger step sizes or operator learning techniques. For the accelerated primal-dual method with larger step sizes, convergence can still be proved rigorously, while it numerically accelerates the original primal-dual method in a simple and universal way. For the operator learning acceleration, we construct deep neural network surrogate models for the PDEs involved. Once a neural operator is learned, solving a PDE requires only a forward pass of the network, which substantially reduces the computational cost. The accelerated primal-dual method with operator learning is mesh-free, numerically efficient, and scalable to different types of PDEs. The effectiveness of both acceleration techniques is validated by preliminary numerical results.
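The kind of primal-dual iteration the abstract refers to can be illustrated, in a simplified finite-dimensional setting, by a Chambolle-Pock-type scheme for a nonsmooth least-squares problem. This is a generic hedged sketch, not the authors' algorithm: in the paper the operator would be a PDE solution map (or its learned neural-operator surrogate, so applying it is one forward pass), whereas here `K` is just a small matrix, and the step-size names `tau`/`sigma` are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def primal_dual(K, f, alpha, tau, sigma, iters=2000):
    """Generic primal-dual (Chambolle-Pock-type) iteration for
        min_u  0.5 * ||K u - f||^2 + alpha * ||u||_1.
    In the PDE-constrained setting of the abstract, applying K and K.T
    would correspond to the two PDE solves per iteration (or to forward
    passes of a neural-operator surrogate); here K is a plain matrix."""
    u = np.zeros(K.shape[1])
    u_bar = u.copy()
    p = np.zeros(K.shape[0])
    for _ in range(iters):
        # Dual update: prox of sigma*F* with F(y) = 0.5*||y - f||^2.
        p = (p + sigma * (K @ u_bar - f)) / (1.0 + sigma)
        # Primal update: prox of tau*alpha*||.||_1 (soft-thresholding).
        u_new = soft_threshold(u - tau * (K.T @ p), tau * alpha)
        # Extrapolation step.
        u_bar = 2.0 * u_new - u
        u = u_new
    return u

# Toy check with K = I: the minimizer is soft_threshold(f, alpha).
K = np.eye(3)
f = np.array([1.0, -0.3, 0.05])
# Classical step-size condition: tau * sigma * ||K||^2 < 1; the paper's
# first acceleration studies how far such bounds can be enlarged.
u = primal_dual(K, f, alpha=0.1, tau=0.9, sigma=0.9)
# u ≈ [0.9, -0.2, 0.0]
```

The sketch makes the structural point of the abstract concrete: the two operator applications (`K @ u_bar` and `K.T @ p`) dominate each iteration, so replacing an expensive PDE solve by a cheap surrogate forward pass, or safely enlarging `tau` and `sigma`, directly reduces the overall cost.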
APA:
Song, Y., Yuan, X., & Yue, H. (2024). Accelerated primal-dual methods with enlarged step sizes and operator learning for nonsmooth optimal control problems. arXiv. https://arxiv.org/abs/2307.00296
MLA:
Song, Yongcun, et al. "Accelerated Primal-Dual Methods with Enlarged Step Sizes and Operator Learning for Nonsmooth Optimal Control Problems." arXiv, 2024, arxiv.org/abs/2307.00296.