Strobl22

Applied Harmonic Analysis and Friends

June 19th - 25th 2022

Strobl, AUSTRIA

"Explainable Approximation in high Dimensions: Fourier-based Algorithms meet Kernel Methods"

Nestler, Franziska

Kernel methods have gained much popularity and are an important class of machine learning algorithms. The core of such methods is the so-called kernel matrix $K\in\mathbb R^{N\times N}$ with entries $K_{ij}=\kappa(\mathbf x_i,\mathbf x_j)$, where $\kappa$ denotes the kernel function and $\mathbf x_i\in\mathbb R^d$ are the given data points. This matrix is typically dense and large-scale, which makes multiplication with it computationally demanding. We recently proposed to realize the multiplication with $K$ based on the nonequispaced fast Fourier transform (NFFT), which can drastically reduce the computational cost. However, the size of a full grid of Fourier coefficients grows exponentially with the dimension $d$ and, hence, classical FFT-based methods are only efficient for small dimensions. The use of so-called ANOVA (analysis of variance) kernels helps us circumvent the curse of dimensionality. This approach is closely connected with recently proposed algorithms making use of the classical ANOVA decomposition of functions and fast Fourier-based methods. The ANOVA idea makes the obtained model interpretable and helps identify relevant features and the connections between them. We give an overview of both frameworks and their connection.
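To make the computational bottleneck concrete, the following minimal NumPy sketch (not the authors' code; the Gaussian kernel and all names are illustrative assumptions) shows the dense kernel matrix-vector product $K\mathbf v$, whose $\mathcal O(N^2)$ cost in time and memory is what NFFT-based methods reduce:

```python
import numpy as np

def kernel_matvec(X, v, sigma=1.0):
    """Dense kernel matrix-vector product K @ v.

    X : (N, d) array of data points x_i
    v : (N,) coefficient vector
    Uses the Gaussian kernel kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    as an illustrative choice of kernel function.
    """
    # Pairwise squared distances via broadcasting: O(N^2) time and memory,
    # the cost that an NFFT-based fast summation avoids.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))  # K_ij = kappa(x_i, x_j)
    return K @ v

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))   # N = 200 points in d = 3
v = rng.standard_normal(200)
y = kernel_matvec(X, v)             # y has shape (200,)
```

For iterative solvers (e.g. conjugate gradients for kernel ridge regression), only such matrix-vector products are needed, so replacing this dense routine with a fast NFFT-based summation accelerates the whole method without ever forming $K$ explicitly.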
