CMX Student/Postdoc Seminar

Friday November 20, 2020 1:00 PM

Multiscale Computation and Parameter Learning for Kernels from PDEs: Two Provable Examples

Speaker: Yifan Chen, Applied and Computational Mathematics, Caltech
Location: Online Event

This talk concerns the computation and learning of kernel operators arising from PDEs. The standard mathematical model is Lu = f, where L is the inverse of some kernel operator; u and f are functions that may or may not be directly available to us, depending on the problem setup.
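One concrete instance of such an L (an illustrative assumption; the abstract itself does not fix L) is the Whittle-Matérn correspondence, in which the kernel operator is the inverse of a shifted fractional Laplacian:

```latex
% Illustrative Whittle--Matern instance (an assumption, not from the abstract):
% the kernel/covariance operator K is the inverse of a shifted fractional
% Laplacian, so draws from K are Matern-like Gaussian fields.
\[
  \mathcal{L} = (\tau^{2} - \Delta)^{s},
  \qquad
  \mathcal{L} u = f
  \;\Longleftrightarrow\;
  u = \mathcal{K} f, \quad \mathcal{K} := \mathcal{L}^{-1}.
\]
```

Here τ and s are hyperparameters of the assumed model; this is the setting that connects the PDE view of L to the Matérn-like Gaussian fields appearing later in the abstract.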

In the first part, we consider the computation problem: given L and f, compute u. Here L can be a heterogeneous Laplacian or a Helmholtz operator in the high-frequency regime. For this problem, we develop a multiscale framework that achieves nearly exponential convergence of the accuracy with respect to the computational degrees of freedom. The main innovation is an effective coarse-fine scale decomposition of the solution space that exploits local structures of both L and f.

In the second part, we consider the learning problem: given u at some scattered points only, the task is to recover the full u and to learn the operator L that encodes the underlying physics. We approach this problem via Empirical Bayes and Kernel Flow methods. For a Matérn-like model on the torus, we establish their consistency in the large-data limit and explicitly identify their implicit bias in parameter learning, both theoretically and empirically.
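The Empirical Bayes route can be illustrated with a minimal sketch (the toy data, the specific Matérn-5/2 kernel, and all variable names are illustrative assumptions, not the talk's actual model): choose the kernel lengthscale by maximizing the marginal likelihood of scattered observations, then reconstruct u everywhere by kernel interpolation.

```python
# Minimal sketch of the Empirical Bayes step: pick a Matern-5/2 lengthscale
# by maximizing the marginal likelihood of scattered observations of u,
# then recover u everywhere by kernel (GP) interpolation.
# Data, kernel family, and names are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

def matern52(X, Y, ell):
    """Matern-5/2 kernel matrix between 1-D point sets X and Y."""
    s = np.sqrt(5.0) * np.abs(X[:, None] - Y[None, :]) / ell
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

def neg_log_marginal_likelihood(ell, x, u, nugget=1e-6):
    """Negative log marginal likelihood of u under a zero-mean GP prior."""
    K = matern52(x, x, ell) + nugget * np.eye(len(x))
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L, u)          # so that a @ a = u^T K^{-1} u
    return 0.5 * a @ a + np.log(np.diag(L)).sum()

# Scattered, noise-free observations of a truth unknown to the method.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, size=20))
u = np.sin(2.0 * np.pi * x)

# Empirical Bayes: maximize the marginal likelihood over the lengthscale.
res = minimize_scalar(lambda ell: neg_log_marginal_likelihood(ell, x, u),
                      bounds=(0.05, 2.0), method="bounded")
ell_hat = res.x

# Recover the full u by kernel interpolation with the learned lengthscale.
xs = np.linspace(0.0, 1.0, 101)
K = matern52(x, x, ell_hat) + 1e-6 * np.eye(len(x))
u_pred = matern52(xs, x, ell_hat) @ np.linalg.solve(K, u)
err = np.max(np.abs(u_pred - np.sin(2.0 * np.pi * xs)))
print(f"learned lengthscale: {ell_hat:.3f}, sup-error: {err:.2e}")
```

The Kernel Flow alternative discussed in the talk replaces the marginal-likelihood objective with a cross-validation-type loss; the interpolation step is unchanged.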

Contact: Jolene Brink at (626) 395-2813 or jbrink@caltech.edu
For more information visit: http://cmx.caltech.edu/