Francesco Dilevrano
Serial bit accelerator with sparsity management.
Supervisors: Maurizio Martina, Guido Masera. Politecnico di Torino, Master's degree programme in Ingegneria Elettronica (Electronic Engineering), 2023
PDF (Tesi_di_laurea) - Thesis
License: Creative Commons Attribution Non-commercial No Derivatives. Download (5MB)
Abstract:
This thesis begins with an explanation of what neural networks are and the reasons for their widespread adoption in recent years. It then focuses on architectural choices that make some neural networks faster and more efficient, with particular attention to compression methods that exploit sparsity. Starting from a previously developed engine, modifications were applied so that the engine can skip useless operations and thus improve throughput. The main modifications are the introduction of a filtering system to manage the compression methods and changes to adapt the multiply-and-accumulate process.
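To illustrate the idea of skipping useless operations, the following is a minimal software sketch (not the thesis's actual hardware engine) of a zero-skipping multiply-and-accumulate: a bitmask acts as a filter over a compressed weight vector, so only nonzero weights produce multiplications. The names `sparse_weights_t`, `bitmask` and `nz_values` are hypothetical and chosen only for this example.

```c
/* Illustrative sketch only: a software model of zero-skipping in a
 * multiply-and-accumulate (MAC) loop. The compressed representation
 * (bitmask + packed nonzero values) is an assumption for this example,
 * not the compression scheme used by the thesis's engine. */
#include <stdint.h>
#include <stdio.h>

/* Weights stored in a compressed form: a bitmask marks which of the
 * original positions are nonzero, and nz_values holds only those values. */
typedef struct {
    uint32_t bitmask;          /* bit i set => original weight i is nonzero */
    const int8_t *nz_values;   /* nonzero weights, packed in order */
} sparse_weights_t;

/* MAC that only visits positions flagged as nonzero by the bitmask,
 * skipping the multiplications that would contribute nothing. */
static int32_t sparse_mac(const int8_t *activations, sparse_weights_t w)
{
    int32_t acc = 0;
    int nz_idx = 0;
    for (int i = 0; i < 32; i++) {
        if (w.bitmask & (1u << i)) {
            acc += (int32_t)activations[i] * (int32_t)w.nz_values[nz_idx++];
        }
    }
    return acc;
}

int main(void)
{
    int8_t act[32] = {0};
    act[0] = 3; act[5] = -2; act[9] = 7;

    /* Only positions 0, 5 and 9 carry nonzero weights. */
    const int8_t nz[] = {4, 1, -3};
    sparse_weights_t w = { (1u << 0) | (1u << 5) | (1u << 9), nz };

    printf("acc = %d\n", sparse_mac(act, w)); /* 3*4 + (-2)*1 + 7*(-3) = -11 */
    return 0;
}
```

In hardware, the same filtering idea means the MAC datapath is only fed operand pairs whose weight is nonzero, which is where the throughput gain described in the abstract comes from.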
Supervisors: Maurizio Martina, Guido Masera
Academic year: 2023/24
Publication type: Electronic
Number of pages: 65
Subjects:
Degree course: Master's degree programme in Ingegneria Elettronica (Electronic Engineering)
Degree class: New system > Master's degree > LM-29 - INGEGNERIA ELETTRONICA
Collaborating companies: NOT SPECIFIED
URI: http://webthesis.biblio.polito.it/id/eprint/28524