Anthony Nouy (Centrale Nantes - Nantes Université)

SeMath Colloquium

Title: Optimal sampling for linear and nonlinear approximation
Abstract: We consider the approximation of functions from point evaluations, using linear or nonlinear approximation tools. For linear approximation, recent results show that weighted least-squares projections yield quasi-optimal approximations in $L^2$ with a near-optimal sampling budget. This can be achieved by drawing i.i.d. samples from suitable distributions (depending on the linear approximation tool), combined with subsampling methods [1,2].
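As a point of orientation (a sketch of the standard optimal weighted least-squares setup, in notation of our own choosing, not necessarily the exact formulation of [1,2]): for an $m$-dimensional space $V_m \subset L^2(\mu)$ with orthonormal basis $(\varphi_j)_{j=1}^m$, one samples $x_1,\dots,x_n$ i.i.d. from the density

$$ \frac{d\rho}{d\mu}(x) = \frac{1}{m}\sum_{j=1}^{m} \varphi_j(x)^2, \qquad w(x) = \frac{d\mu}{d\rho}(x), $$

and computes the weighted least-squares projection

$$ \hat u \in \arg\min_{v \in V_m} \; \frac{1}{n}\sum_{i=1}^{n} w(x_i)\,\big(u(x_i) - v(x_i)\big)^2 . $$

This estimator is quasi-optimal in $L^2(\mu)$ (in expectation or with high probability) once $n$ scales like $m$ up to logarithmic factors; subsampling then reduces the budget further towards $n \approx m$.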
In the first part of this talk, we review different strategies based on i.i.d. sampling and present alternative strategies based on repulsive point processes, which perform the same task with a reduced sampling complexity [3].
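To fix ideas (a schematic description in the notation above, not the precise construction of [3]): a projection determinantal point process associated with the kernel

$$ K_m(x,y) = \sum_{j=1}^{m} \varphi_j(x)\,\varphi_j(y) $$

produces exactly $m$ points whose joint density is proportional to $\det\big(K_m(x_i,x_j)\big)_{i,j=1}^{m}$ with respect to $\mu^{\otimes m}$. The repulsion encoded by the determinant spreads the points over the domain, which is what allows the number of samples to match, or nearly match, the dimension of the approximation space.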
In the second part, we show how these methods can be used to approximate functions with nonlinear approximation tools in an active learning setting, by coupling iterative algorithms on manifolds with optimal sampling methods for the (quasi-)projection onto successive linear spaces [4].
The proposed algorithm can be interpreted as a stochastic gradient method using optimal sampling, with provable convergence properties under classical convexity and smoothness assumptions. It can also be interpreted as a natural gradient descent on a manifold embedded in $L^2$, which appears to be a Newton-type algorithm when written in terms of the coordinates of a parametrized manifold.
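Schematically (in our own notation, assuming a least-squares risk; see [4] for the precise algorithm): with a model class $\mathcal{M} \subset L^2(\mu)$ and risk $F(v) = \frac{1}{2}\|u - v\|_{L^2(\mu)}^2$, the iteration takes the form

$$ u_{k+1} = R_{\mathcal{M}}\Big( u_k - s_k\, \hat P_{T_k}\, \nabla F(u_k) \Big), $$

where $T_k$ is a linear space attached to $u_k$ (e.g. a tangent space of $\mathcal{M}$), $\hat P_{T_k}$ is an empirical (quasi-)projection onto $T_k$ computed from optimally sampled point evaluations of $\nabla F(u_k) = u_k - u$, $s_k$ is a step size, and $R_{\mathcal{M}}$ maps the update back to the model class.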
We present applications of this algorithm to the learning of neural networks and tree tensor networks.

This is joint work with R. Gruhlke, B. Michel, C. Miranda, and P. Trunschke.

References:
[1] M. Sonnleitner and M. Ullrich. On the power of iid information for linear approximation. Journal of Applied and Numerical Analysis, 1(1):88–126, 2023.
[2] C. Haberstich, A. Nouy, and G. Perrin. Boosted optimal weighted least-squares. Mathematics of Computation, 91(335):1281–1315, 2022.
[3] A. Nouy and B. Michel. Weighted least-squares approximation with determinantal point processes and generalized volume sampling. arXiv:2312.14057.
[4] R. Gruhlke, A. Nouy, and P. Trunschke. Optimal sampling for stochastic and natural gradient descent. arXiv:2402.03113.

(Host: Markus Bachmayr)
