Strobl22

Applied Harmonic Analysis and Friends

June 19th - 25th 2022

Strobl, AUSTRIA

"Alpha-rectifying frames: Injectivity and local reconstruction of ReLU-layers"

Haider, Daniel

We introduce a new family of frames in $\mathbb{R}^n$, called $\alpha$-rectifying frames, that allow exact reconstruction after rectification of their analysis coefficients. In other words, the $\alpha$-rectified analysis operator of a frame $\Phi = \{\phi_i\}_{i=1}^m$ with this property, defined as \begin{align*} T^*_{\text{ReLU}_{\alpha}}:\mathbb{R}^n&\rightarrow \mathbb{R}^m\\ x&\mapsto \left\{\max\left(0,\langle x,\phi_i\rangle-\alpha_i\right)\right\}_{i=1}^m, \end{align*} is a non-linear one-to-one operator. In the context of deep learning, operators of this type appear as so-called ReLU-layers in neural networks, where $\phi_i$ and $\alpha_i$ arise as ``learned'' parameters after an optimization procedure. The motivation of this work is to better understand these operators; in particular, we treat the question of injectivity in order to determine when perfect reconstruction of the input data from the output of a layer is possible. A frame-theoretical approach fits this purpose perfectly, but has appeared only marginally in the literature so far. In addition, explicitly incorporating a domain $K$ for the input data allows flexibility in cases where the $\alpha$-rectifying property cannot be guaranteed on all of $\mathbb{R}^n$, since $\alpha$ and $K$ stand in a direct trade-off. Altogether, this makes the presented approach highly relevant for applications.
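
As an illustration, the following minimal NumPy sketch evaluates the $\alpha$-rectified analysis operator defined above. The function name alpha_rectified_analysis and the random frame in the example are hypothetical choices for demonstration only, not taken from the talk.

    import numpy as np

    def alpha_rectified_analysis(x, Phi, alpha):
        """Apply T*_ReLU_alpha: x -> {max(0, <x, phi_i> - alpha_i)}, i = 1..m.

        Phi   : (m, n) array whose rows are the frame vectors phi_i
        alpha : (m,) array of rectification thresholds alpha_i
        x     : (n,) input vector
        """
        # Phi @ x gives the analysis coefficients <x, phi_i>;
        # subtracting alpha and clipping at zero is the rectification step.
        return np.maximum(0.0, Phi @ x - alpha)

    # Hypothetical example: a redundant frame of m = 6 vectors in R^2
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((6, 2))
    alpha = np.zeros(6)  # alpha = 0 recovers the plain ReLU-layer case
    x = np.array([1.0, -0.5])
    print(alpha_rectified_analysis(x, Phi, alpha))

For $\alpha = 0$ this is exactly a bias-free ReLU-layer; injectivity of the map then depends on whether the rows of Phi form an $\alpha$-rectifying frame on the considered domain $K$.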
