Deep Learning Seminar

"Input noise injection for improved learning with variants and applications on genomic and image data"

Khalfaoui, Beyrem

Overfitting is a general and important issue in machine learning that has been addressed in several ways throughout the field's progress. We re-formalise Input Noise Injection (INI) as a family of increasingly popular regularisation methods. We provide intuitive and theoretical arguments for its benefits in preventing overfitting and show how it can be incorporated into the learning problem. In this context we focus on the dropout trick and provide a novel and simple deterministic approximation, both to understand its mechanism and to derive a new set of methods. We then present the DropLasso method, a sparsity-inducing variant of dropout, and apply it to single-cell RNA-seq data, where we show that it can improve on the accuracy of both the Lasso and dropout while performing biologically meaningful feature selection. Finally, we present Adaptive Structured Noise Injection, a regularisation method for shallow and deep networks in which the structure of the noise applied to the input of a hidden layer follows the covariance of its activations. We provide a fast algorithm for this adaptive scheme, study the regularisation properties of our method on linear and multilayer networks using a quadratic approximation, and show improved generalisation performance and representation disentanglement in experiments on real image datasets.
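As a rough illustration of the two ingredients combined in DropLasso, the sketch below trains a linear model with multiplicative dropout noise on the inputs plus an L1 proximal (soft-thresholding) step. This is a minimal NumPy mock-up under assumed hyperparameters (squared loss, dropout rate `p`, penalty `lam`), not the authors' actual algorithm or experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, p, rng):
    # Bernoulli mask scaled by 1/(1-p) so the input is unchanged in expectation
    return rng.binomial(1, 1.0 - p, size=shape) / (1.0 - p)

def droplasso_step(w, X, y, p, lam, lr, rng):
    # One stochastic step of a DropLasso-style update (illustrative only):
    # dropout noise injected on the inputs, then a gradient step on the
    # squared loss, then the proximal operator of the L1 penalty.
    Xn = X * dropout_mask(X.shape, p, rng)     # noisy inputs
    grad = Xn.T @ (Xn @ w - y) / len(y)        # squared-loss gradient
    w = w - lr * grad
    return np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft threshold

# Toy data: only the first of five features carries signal.
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0]
w = np.zeros(5)
for _ in range(500):
    w = droplasso_step(w, X, y, p=0.3, lam=0.05, lr=0.1, rng=rng)
```

The dropout noise acts as a data-dependent (ridge-like) regulariser while the L1 step zeroes out irrelevant coordinates, which is the sparsity-inducing behaviour the abstract describes.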
http://www.mines-paristech.fr/Agenda/Soutenance-de-these-de-Beyrem-KHALFAOUI/5846
