WG5 Research Seminar - Talk by Vladimir Jacimovic and Aladin Crnkić

On June 14, 2022, Vladimir Jacimovic (University of Montenegro) and Aladin Crnkić (Technical University in Bihać) will give a talk at 3:00 pm CEST about

"Continuous-time Dynamical Systems for Infinitely Deep Learning".

The talk will be held online via Zoom. The Zoom URL will be sent to all members of WG5 and to all further colleagues who send an email to Jan Giesselmann.

Abstract

In the past two decades neural networks have experienced tremendous growth. Nowadays neural networks consisting of hundreds of layers and millions of parameters are very common. This growth led to a paradigm shift in Machine Learning and the emergence of the concept known as Deep Learning. Typically, such huge neural networks are trained through backpropagation, using stochastic gradient descent.
In the meantime a new approach has been proposed: one can treat the neural network as a dynamical system and let the number of layers tend to infinity. This procedure turns a recurrent neural network into a system of ODEs, which can be viewed as an infinitely deep realization of the network. This approach led to new concepts in DL, such as Normalizing Flows, Neural ODEs, etc. These systems are also trained through backpropagation, but in its continuous-time form, known as the adjoint sensitivity method. Overall, classical numerical methods from ODEs, PDEs and Optimal Control Theory are now used in DL with promising results on concrete problems.
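The depth-to-infinity idea above can be sketched in a few lines: each forward-Euler step of an ODE h' = f(h, t) has the same form as a residual-network layer, h ← h + dt·f(h, t). The following is a minimal NumPy sketch, not the speakers' implementation; the one-layer vector field `f` and its weight matrix `W` are illustrative assumptions.

```python
import numpy as np

def f(h, t, W):
    # Hypothetical vector field dh/dt = f(h, t; W): a tiny one-layer "network".
    return np.tanh(W @ h)

def odeint_euler(h0, W, t0=0.0, t1=1.0, steps=100):
    """Integrate dh/dt = f(h, t; W) with forward Euler.

    Each Euler step h <- h + dt * f(h, t) matches a ResNet layer,
    so the infinite-depth limit of the residual network is this ODE flow.
    """
    h, dt = h0.copy(), (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(h, t0 + k * dt, W)
    return h

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)) * 0.1   # illustrative random weights
h0 = rng.standard_normal(4)             # input = initial condition
h1 = odeint_euler(h0, W)                # output = state at time t1
```

Training such a model by the adjoint sensitivity method amounts to integrating a second ODE backwards in time for the gradients, rather than storing every intermediate layer as ordinary backpropagation does.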
In this talk I will provide a brief survey of these recent developments in DL and AI. I will also mention the advantages of machine learning on non-Euclidean spaces, an observation that has likewise brought new paradigms to the field, giving birth to so-called Geometric Deep Learning.
In general, my talk will cover a broad area with many ideas, without going deep into any of them.