"On the stability of deep convolutional neural networks under irregular or random deformations"Trapasso, S. IvanThe problem of robustness under location deformations for deep convolutional neural networks (DCNNs) has been studied in pioneering works, especially for scattering-type architectures, for deformation vector fields $\tau(x)$ with some regularity - at least $C^1$. In a recent note we address this issue for any field $\tau\in L^\infty(\mathbb{R}^d;\mathbb{R}^d)$, without any additional regularity assumption, hence including the case of wild irregular deformations such as a noise on the pixel location of an image. We prove that for signals in multiresolution approximation spaces $U_s$ at scale $s$, whenever the network is Lipschitz continuous (regardless of its architecture), stability in $L^2$ holds in the regime $\|\tau\|_{L^\infty}/s\ll 1$, essentially as a consequence of the uncertainty principle. When $\|\tau\|_{L^\infty}/s\gg 1$ instability can occur even for well-structured DCNNs, and we provide a sharp upper bound for the asymptotic growth rate. The stability results are then extended to signals in the Besov space $B^{d/2}_{2,1}$ tailored to the given multiresolution approximation. We also consider the case of more general time-frequency deformations. Finally, we study the issue of stability in mean when $\tau(x)$ is modeled as a random field (not bounded, in general) with $|\tau(x)|$, $x\in\mathbb{R}^d$, identically distributed variables. |
http://univie.ac.at/projektservice-mathematik/e/talks/Trapasso_2021-06_Online_ICCHA_2021 - Trapasso.pdf