Matteo Raviola
Training Kernel Neural ODEs with optimal control and Riemannian optimization.
Supervisors: Claudio Canuto, Fabio Nobile. Politecnico di Torino, Master's degree in Ingegneria Matematica (Mathematical Engineering), 2022
PDF (Tesi_di_laurea), Thesis, 13MB. License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Nowadays, Machine Learning pipelines permeate the scientific computing world. The flexibility of Neural Networks makes them a formidable tool for a wide range of tasks; however, their training remains a computationally challenging optimization problem. This thesis focuses on a specific kind of Neural ODE, Kernel Neural ODEs (KerODEs), in which the usual parametric non-linearities are replaced by elements of a reproducing kernel Hilbert space (RKHS) fixed a priori. Classical training algorithms are based on variants of stochastic gradient descent, coupled with the celebrated backpropagation algorithm for gradient computation. Though extremely versatile, these approaches can suffer from long computational times and/or a high cost per iteration.
We propose and numerically explore methodologies to overcome both of these issues for the optimization of KerODE parameters in the context of a regression task.
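To make the setting concrete, here is a minimal, hypothetical sketch of the idea behind a KerODE: the right-hand side of the ODE is a fixed linear combination of kernel sections k(., c_i) from an RKHS whose centers are fixed a priori, and only the coefficients are trained on a regression task. The RBF kernel, forward Euler integrator, finite-difference gradients, and all names below are illustrative choices, not the methods developed in the thesis.

```python
import numpy as np

def rbf_features(x, centers, gamma=1.0):
    # Kernel sections k(., c_i) evaluated at x: the RKHS basis fixed a priori.
    return np.exp(-gamma * (x[:, None] - centers[None, :]) ** 2)

def kerode_flow(x0, alpha, centers, n_steps=50, dt=0.02):
    # Integrate dx/dt = sum_i alpha_i k(x, c_i) with forward Euler.
    x = x0.copy()
    for _ in range(n_steps):
        x = x + dt * rbf_features(x, centers) @ alpha
    return x

# Toy regression task: fit the flow endpoint to targets by plain gradient
# descent on the kernel coefficients (finite differences for brevity; an
# adjoint/backpropagation scheme would be used in practice).
rng = np.random.default_rng(0)
centers = np.linspace(-2.0, 2.0, 8)      # fixed a priori, never trained
x0 = rng.uniform(-1.0, 1.0, size=32)
y = np.tanh(2.0 * x0)                    # regression targets
alpha = np.zeros(8)

def loss(a):
    return np.mean((kerode_flow(x0, a, centers) - y) ** 2)

lr, eps = 0.2, 1e-5
for _ in range(300):
    grad = np.array([(loss(alpha + eps * e) - loss(alpha)) / eps
                     for e in np.eye(8)])
    alpha -= lr * grad

print(f"final loss: {loss(alpha):.5f}")
```

Because the centers are frozen, the optimization variables reduce to the coefficient vector, which is the structure the thesis exploits with optimal-control and Riemannian-optimization techniques in place of the naive gradient loop above.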
