As part of the colloquium,
Prof. Dr. Ian H. Sloan, University of New South Wales (Sydney)
will speak on
High dimensional approximation – making life easy with kernels
High dimensional approximation problems commonly arise from parametric PDE problems in which the parametric input depends on a large number of independent univariate random variables. Often (as in the method of “generalized polynomial chaos”, or GPC) the dependence on the parametric variables is approximated by multivariate polynomials, at a difficulty and cost that grow exponentially with the dimension (the “curse of dimensionality”). For this reason, sparsity of coefficients is a major focus in all implementations of GPC.
In this lecture we develop a different approach, one in which there is no need for sparsification, and no curse of dimensionality. The method, proposed in a 2022 paper with Frances Kuo, Vesa Kaarnioja, Yoshihito Kazashi and Fabio Nobile, uses kernel interpolation with periodic kernels, with the kernels located at lattice points, as advocated long ago by Hickernell and colleagues.
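To make the idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of kernel interpolation at rank-1 lattice points with a periodic product kernel of Korobov type. The generating vector `z`, the weights `gamma`, and the test function are hypothetical choices for demonstration; fast implementations would exploit the circulant structure of the Gram matrix via the FFT, which is omitted here.

```python
import numpy as np

def bernoulli2(t):
    # Bernoulli polynomial B_2(t) = t^2 - t + 1/6, evaluated on [0, 1)
    return t * t - t + 1.0 / 6.0

def kernel(x, y, gamma):
    # Periodic product kernel (Korobov type, smoothness order alpha = 1):
    # K(x, y) = prod_j (1 + gamma_j * B_2(frac(x_j - y_j)))
    frac = np.mod(x - y, 1.0)
    return np.prod(1.0 + gamma * bernoulli2(frac), axis=-1)

def lattice_points(n, z):
    # Rank-1 lattice: x_k = frac(k * z / n), k = 0, ..., n-1
    k = np.arange(n)[:, None]
    return np.mod(k * z / n, 1.0)

def interpolate(f, n, z, gamma):
    pts = lattice_points(n, z)
    # Gram matrix K(x_k, x_l); for lattice points it is circulant,
    # which fast methods exploit (a dense solve suffices for this sketch)
    G = kernel(pts[:, None, :], pts[None, :, :], gamma)
    coef = np.linalg.solve(G, f(pts))
    return (lambda x: kernel(x[None, :], pts, gamma) @ coef), pts

# Illustrative example in d = 2 dimensions
d = 2
z = np.array([1, 19])         # hypothetical generating vector
gamma = np.full(d, 1.0)       # hypothetical product weights
f = lambda x: np.prod(1.0 + 0.5 * np.sin(2 * np.pi * x), axis=-1)

interp, pts = interpolate(f, n=64, z=z, gamma=gamma)
```

By construction the interpolant reproduces `f` exactly at the lattice points; the choice of weights governs both the kernel and the lattice, which is the theme of the results summarized below.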
The lattice points and the kernels depend on parameters called “weights”. In the 2022 paper the recommended weights were “SPOD” weights, leading to an L² error independent of the dimension, but with a cost growing as the square of the number of lattice points. A newer 2023 paper with Kuo and Kaarnioja introduced “serendipitous” weights, for which the cost grows only linearly with both the dimension and the number of lattice points, allowing practical computations in as many as 1,000 dimensions. The rate of convergence proved in the above papers was of order n^{-α/2} for interpolation using the reproducing kernel of a space with mixed smoothness of order α. However, a new result with Vesa Kaarnioja doubles the proven convergence rate to n^{-α}.
The lecture takes place on Wednesday, 20 March 2024
at 5:15 p.m. in room W01 0-006
Coffee/tea at 4:45 p.m. in room W1 2-213