
SEMINAR by Rahul Parhi

ISBA
Louvain-la-Neuve

Rahul Parhi (École Polytechnique Fédérale de Lausanne, EPFL)

will give a presentation on

Deep Learning Meets Sparse Regularization

Abstract:

Deep learning has been wildly successful in practice and most state-of-the-art artificial intelligence systems are based on neural networks. Lacking, however, is a rigorous mathematical theory that adequately explains the amazing performance of deep neural networks.
In this talk, I present a new mathematical framework that provides the beginning of a deeper understanding of deep learning. This framework precisely characterizes the functional properties of trained neural networks. The key mathematical tools that support this framework include transform-domain sparse regularization, the Radon transform of computed tomography, and approximation theory. This framework explains the effect of weight decay regularization in neural network training, the importance of skip connections and low-rank weight matrices in network architectures, the role of sparsity in neural networks, and why neural networks can perform well in high-dimensional problems.

  • Friday, 10 November 2023, 08h00 to 17h00