Francesco Dilevrano
Serial bit accelerator with sparsity management.
Supervisors: Maurizio Martina, Guido Masera. Politecnico di Torino, Master's degree in Electronic Engineering (Ingegneria Elettronica), 2023
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract
This thesis begins with an explanation of what neural networks are and the reasons for their widespread adoption in recent years. It then focuses on architectural choices that make some neural networks faster and more efficient, with particular attention to compression methods that exploit sparsity. Starting from a previously developed engine, modifications have been applied so that the engine can skip useless operations and thus improve throughput. The main modifications are the introduction of a filtering system to manage the compression methods and adaptations to the Multiply and Accumulate process.
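As a rough illustration of the idea described above (not the thesis's actual engine), the sketch below compares a dense multiply-and-accumulate loop with one that consumes a compressed sparse operand and skips the multiplications by zero. The function names and the (index, value) compression format are assumptions for illustration only.

```python
# Hypothetical sketch of zero-skipping in a multiply-accumulate (MAC) loop.
# Names and the compressed format are illustrative, not the thesis's design.

def mac_dense(weights, activations):
    """Baseline MAC: every product is computed, including multiplies by zero."""
    acc = 0
    for w, a in zip(weights, activations):
        acc += w * a
    return acc

def compress(values):
    """Compress a sparse vector into (index, value) pairs, dropping zeros."""
    return [(i, v) for i, v in enumerate(values) if v != 0]

def mac_sparse(compressed_weights, activations):
    """Sparse MAC: only non-zero weights are processed, skipping useless ops."""
    acc = 0
    for i, w in compressed_weights:
        acc += w * activations[i]
    return acc
```

With a sparse weight vector, `mac_sparse` performs one multiply per non-zero entry instead of one per element, which is the source of the throughput gain the abstract refers to.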
