SFU Mathematics of Computation, Application and Data ("MOCAD") Seminar: Gregor Maier
Topic
On the Approximation of Gaussian Lipschitz Functionals
Speakers
Gregor Maier
Details
Over the past few years, operator learning – the approximation of mappings between infinite-dimensional function spaces using ideas from machine learning – has attracted increasing research attention. Approximate operators, learned from data, hold promise as efficient surrogate models for problems in scientific computing. Multiple model designs have been proposed so far, and their efficiency has been demonstrated in various practical applications. These empirical findings are supported by a (slowly) growing body of theoretical approximation guarantees, which focus to a large extent on linear and holomorphic operators. However, far less is known about the approximation of (nonlinear) operators that are merely Lipschitz continuous.
In this talk, I will focus on (scalar-valued) Lipschitz functionals in a Gaussian setting. I will first consider their approximation by Hermite polynomials and present lower and upper bounds on the best s-term error. This will be followed by a discussion of the approximation of Lipschitz functionals by arbitrary (adaptive) sampling algorithms, which yields sharp error bounds. Finally, I will address the problem of recovering Lipschitz functionals from i.i.d. pointwise samples.
This is joint work with Ben Adcock (SFU).
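For a concrete picture of the first part of the talk, the following is a minimal numerical sketch, in Python, of a best s-term Hermite approximation of a simple Lipschitz functional of a standard Gaussian. It is not taken from the talk: the functional f(x) = |x|, the truncation degree, and the sparsity level s are illustrative choices only.

```python
# Minimal illustrative sketch (not from the talk; all parameters are hypothetical):
# best s-term approximation of a scalar Lipschitz functional of a standard
# Gaussian by probabilists' Hermite polynomials.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermevander, hermeval

f = np.abs          # hypothetical Lipschitz functional, f(x) = |x|
max_degree = 30     # hypothetical truncation degree of the Hermite expansion
s = 5               # hypothetical sparsity level (number of retained terms)

# Gauss-Hermite_e quadrature uses the weight exp(-x^2/2); rescaling by sqrt(2*pi)
# turns it into an expectation against the standard normal density.
nodes, weights = hermegauss(200)
weights = weights / np.sqrt(2.0 * np.pi)

# Coefficients a_k = E[f(X) He_k(X)] / sqrt(k!) in the orthonormal basis
# h_k = He_k / sqrt(k!).
V = hermevander(nodes, max_degree)              # V[i, k] = He_k(nodes[i])
norms = np.sqrt([float(factorial(k)) for k in range(max_degree + 1)])
coeffs = V.T @ (f(nodes) * weights) / norms

# Best s-term approximation: keep the s coefficients of largest magnitude;
# in L^2(gamma) the error is the l^2 norm of the discarded coefficients.
keep = np.argsort(np.abs(coeffs))[-s:]
c_sparse = np.zeros_like(coeffs)
c_sparse[keep] = coeffs[keep]

def approx(x):
    # Convert back to unnormalized He_k coefficients before evaluating.
    return hermeval(x, c_sparse / norms)

# Quadrature estimate of the L^2(gamma) approximation error.
err = np.sqrt(np.sum(weights * (f(nodes) - approx(nodes)) ** 2))
print(f"best {s}-term Hermite error (L2 w.r.t. N(0,1)): {err:.3e}")
```

Retaining the s largest coefficients is exactly what the best s-term error measures: the smallest error achievable by any Hermite expansion with at most s nonzero terms.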