Zecheng Zhang - Deep Operator Learning Approximation and Distributed Applications - IPAM at UCLA
July 16, 2025
Abstract
Recorded 16 July 2025. Zecheng Zhang of the University of Notre Dame presents "Deep Operator Learning Approximation and Distributed Applications" at IPAM's Sampling, Inference, and Data-Driven Physical Modeling in Scientific Machine Learning Workshop.
Abstract: Neural operators are deep learning architectures designed to approximate operators, which are mappings between infinite-dimensional function spaces. They have been widely applied to solve problems involving partial differential equations, such as predicting solutions from given initial or boundary conditions. Despite their empirical success, some theoretical questions remain unresolved. In this talk, we will discuss the analysis of error convergence and generalization; the results are valid for a broad class of widely used neural operators. These theoretical developments further motivate the design of distributed and federated learning algorithms that leverage the underlying structure of neural operator approximations to address two key challenges in practical applications: (1) handling heterogeneous and multiscale input functions, and (2) extending the framework to a multi-operator learning setting to enable generalization to previously unseen tasks. Numerical evidence regarding these applications will be presented.
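To make the "operator learning" idea in the abstract concrete, the following is a minimal, illustrative sketch of a DeepONet-style forward pass using only NumPy. It is not the speaker's implementation; the architecture sizes, weight initialization, and the example input function are all assumptions chosen for readability. The key structure is that a branch network encodes the input function u sampled at m fixed sensor points, a trunk network encodes the query location y, and the operator output is approximated by their inner product: G(u)(y) ≈ Σ_k b_k(u) t_k(y).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights):
    """Plain fully connected network with tanh hidden activations."""
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def init_weights(sizes, rng):
    """Random (untrained) weights for each layer; illustrative only."""
    return [(rng.normal(0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

m, p = 50, 32                            # sensor count, latent width (assumed)
branch = init_weights([m, 64, p], rng)   # encodes the input function u
trunk = init_weights([1, 64, p], rng)    # encodes the evaluation point y

def deeponet(u_sensors, y):
    """Approximate G(u)(y) via the branch-trunk inner product."""
    b = mlp(u_sensors, branch)           # shape (p,)
    t = mlp(np.atleast_1d(y), trunk)     # shape (p,)
    return float(b @ t)

# Example: evaluate the (untrained) operator on u(x) = sin(pi*x) at y = 0.3.
xs = np.linspace(0.0, 1.0, m)
out = deeponet(np.sin(np.pi * xs), 0.3)
```

In practice the branch and trunk networks would be trained jointly on input-output function pairs (e.g. PDE initial conditions and their solutions), and the same branch-trunk decomposition is what the distributed and federated algorithms mentioned in the abstract exploit.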
Learn more online at: https://www.ipam.ucla.edu/programs/workshops/sampling-inference-and-data-driven-physical-modeling-in-scientific-machine-learning-2/