Chen Yi Zhang
Learning capabilities of belief propagation based algorithms for sparse binary neural networks.
Supervisors: Luca Dall'Asta, Jean Barbier. Politecnico di Torino, Master's degree programme in Physics Of Complex Systems (Fisica Dei Sistemi Complessi), 2020
PDF (Tesi_di_laurea), Thesis (1MB). License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract
With the growth in size and complexity of modern data sets, the computational and energetic costs of training large, fully connected neural networks have become an important issue. Exploring the learning capabilities of sparse (possibly binarized) architectures, which possess far fewer degrees of freedom yet empirically appear to generalize well, is therefore an important research direction. In this work, the learning performance of belief propagation (BP) based algorithms, applied to simple two-layer sparse neural networks with discrete synapses, is analysed. Initially, the work is carried out in the so-called teacher-student scenario, in which the learning problem corresponds to inferring the weight values of a teacher network whose architecture is given.
In this first part BP provides encouraging results, allowing perfect reconstruction of the weights of the teacher network that generated the training data, using only a small number of data points.
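The teacher-student setup described above can be sketched as follows. This is a minimal illustration assuming a single binary perceptron rather than the two-layer sparse networks the thesis studies, and the sizes, random seed, and brute-force student search are illustrative assumptions, not the BP algorithm itself:

```python
import itertools
import numpy as np

# Teacher-student scenario: a "teacher" with binary (+-1) synapses labels
# random inputs; learning amounts to inferring the teacher's weights.
rng = np.random.default_rng(0)
N, P = 9, 60  # kept small so all 2**N candidate students can be enumerated

w_teacher = rng.choice([-1, 1], size=N)   # hidden teacher weights
X = rng.choice([-1, 1], size=(P, N))      # random +-1 input patterns
y = np.sign(X @ w_teacher)                # teacher labels (N odd, so no ties)

# Brute-force stand-in for inference: collect every binary student that is
# consistent with all P examples (BP would find one without enumerating).
consistent = [w for w in itertools.product([-1, 1], repeat=N)
              if np.all(np.sign(X @ np.array(w)) == y)]

print(len(consistent))  # the teacher is always among the survivors; with
                        # enough patterns it is typically the only one left
```

Enumeration is exponential in N and only works at toy sizes; the point of message-passing algorithms such as BP is to locate a consistent binary weight assignment without such a search.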