"Universal sparsity of deep ReLU networks"Elbrächter, DennisIn recent years Deep Learning has been successfully applied to avariety of very different problems too numerous to fit on a single page. While this might very well constitute its most attractive feature in practice, understanding why it is so universally useful remains a compelling challenge. We try to approach the issue from a sparsity point of view which leads to a remarkable approximation theoretic universality property of deep neural networks. We introduce (or assimilate) a number of key concepts, which allows us to compare neural networks to classical representation systems (meaning e.g. wavelets, shearlets, and Gabor systems, or more generally any system generated from some mother function through translation, dilation and modulation). This enables us to establish that any function class is (asymptotically) at least as sparse w.r.t. (ReLU) neural networks, as it is in any ’reasonable’ classical representation system. |