Fabio Eterno
Differentiable Neural Architecture Search Algorithms for TinyML benchmarks.
Supervisors: Daniele Jahier Pagliari, Alessio Burrello, Matteo Risso. Politecnico di Torino, Master's degree programme in Data Science and Engineering, 2022
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Nowadays, Artificial Intelligence (AI), especially in the form of Machine Learning (ML) and Deep Learning (DL), is becoming the go-to approach for solving complex problems in several sectors such as Computer Vision (CV), Speech Recognition, Natural Language Processing (NLP), and many others. Despite the huge effort spent by public and private actors to reach state-of-the-art results, the design of Deep Neural Networks (DNNs) is still a manual process heavily based on empirical rules and heuristics, thus requiring designers with strong expertise. This has inspired researchers to define a new family of algorithms called Neural Architecture Search (NAS). NAS algorithms are becoming very popular in the TinyML/TinyDL domain, where the choice of the specific ML/DL model structure is of primary importance.
Indeed, deploying DNNs on tiny devices (e.g., small microcontrollers, IoT nodes, etc.) requires considering not only the final accuracy reached by the model, but also the hardware constraints of the target device in terms of memory footprint, latency, and energy consumption.
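The hardware-aware, differentiable NAS idea outlined above can be sketched as follows. This is a minimal illustration, not the thesis's actual method: architecture choices are relaxed into a softmax over trainable parameters, so the expected hardware cost (here a hypothetical per-operation memory cost) becomes differentiable and can be added to the task loss with a regularization strength `lam`. All names and numbers are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over architecture parameters.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dnas_objective(task_loss, alpha, op_costs, lam):
    """DNAS-style combined objective (illustrative sketch).

    alpha    : trainable architecture parameters, one per candidate op
    op_costs : hardware cost of each candidate op (e.g., memory in KB)
    lam      : trade-off between task loss and expected hardware cost
    """
    probs = softmax(alpha)                   # relaxed, differentiable op choice
    expected_cost = float(probs @ op_costs)  # expected cost under the relaxation
    return task_loss + lam * expected_cost

# Hypothetical example: three candidate ops with different memory costs (KB).
alpha = np.array([0.5, 1.0, -0.3])
costs = np.array([32.0, 128.0, 8.0])
obj = dnas_objective(task_loss=0.42, alpha=alpha, op_costs=costs, lam=0.01)
```

Because the cost term is a smooth function of `alpha`, gradient descent can jointly lower the task loss and steer the search toward cheaper operations, which is what makes this family of NAS methods attractive for constrained TinyML targets.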