Strobl24

More on Harmonic Analysis

June 9th - 15th 2024

Strobl, AUSTRIA

"CNNs going OFF: Stability of Convolutional Layers via Oversampled Filterbank Frames"

Haider, Daniel

Data-driven feature representations obtained via gradient updates have become the dominant paradigm for the front-end processing of audio signals in deep learning (a.k.a. end-to-end learning). The flexibility and adaptivity that come with this approach have proven effective in practice. However, little is known about its stability behavior throughout training. In this work, we study the stability of convolutional layers with 1D filters under gradient descent from the perspective of oversampled filterbank frames. We adapt classical tightness results from frame theory to filterbanks with filters of fixed support, taking into account techniques used in practice such as decimation, dilation, and residual connections. Building on these insights, we propose a preconditioning procedure and a numerically efficient regularization method that consistently keeps filterbanks tight under gradient updates. We demonstrate both in numerical experiments where we train randomly initialized filterbanks to approximate the responses of given tight auditory filterbanks.
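As an illustration of the frame-theoretic quantities involved (a minimal sketch, not the authors' implementation): for an undecimated FIR filterbank with filters h_k, the frame operator is diagonalized by the DFT, and its eigenvalues are the total power response Σ_k |ĥ_k(ω)|². The frame bounds A and B are the minimum and maximum of this response, and the filterbank is tight exactly when A = B. A simple tightness penalty, such as the variance of the power response across frequencies, can serve as a regularizer during gradient updates; the function names and FFT grid size below are hypothetical choices.

```python
import numpy as np

def frame_bounds(filters, n_fft=512):
    """Frame bounds (A, B) of an undecimated FIR filterbank.

    For stride 1, the frame operator is diagonal in the Fourier
    domain with eigenvalues sum_k |h_k_hat(omega)|^2, so A and B are
    the min and max of the total power response over frequency.
    """
    H = np.fft.fft(filters, n=n_fft, axis=1)      # (K, n_fft) frequency responses
    response = np.sum(np.abs(H) ** 2, axis=0)     # total power response per frequency bin
    return response.min(), response.max()

def tightness_penalty(filters, n_fft=512):
    """Penalty that vanishes iff the filterbank is tight (A == B):
    the variance of the total power response across frequencies."""
    H = np.fft.fft(filters, n=n_fft, axis=1)
    response = np.sum(np.abs(H) ** 2, axis=0)
    return np.var(response)
```

For example, the two filters [1, 0] and [0, 1] have flat responses |ĥ_0|² = |ĥ_1|² = 1, so the filterbank is tight with A = B = 2 and zero penalty, while randomly initialized filters generally have A < B.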
