Luca Crupi
Integration of Deep-learning-powered Drone-to-human Pose Estimation on Ultra-low-power Autonomous Flying Nano-drones.
Supervisors: Daniele Jahier Pagliari, Daniele Palossi, Christian Pilato. Politecnico di Torino, Master's degree programme in Computer Engineering (Ingegneria Informatica), 2022
PDF (Tesi_di_laurea)
- Tesi
License: Creative Commons Attribution Non-commercial No Derivatives. Download (17MB)
Abstract:
Nano-drones can perform a wide range of tasks that no other comparably versatile platform can, including indoor surveillance, search and rescue, and inspection. Their reduced size and cost, together with their tight power budget for computation (under 100 mW), make running deep learning models on these devices particularly challenging. The aim of this work is to study automated ways to design and deploy sufficiently tiny Neural Network (NN) architectures, reducing the number of parameters and operations of an input seed network. To address this task we employed a novel Neural Architecture Search (NAS) technique called Pruning In Time (PIT). PIT was previously designed and tested on 1D networks and Temporal Convolutional Networks (TCNs); in this thesis we extended it to 2D models and demonstrated its capabilities on a drone-to-human pose estimation task on the Crazyflie 2.1 nano-drone, where the prediction variables are x, y, z, and phi (the angle of rotation around z). The main appeal of this NAS is that its search time is approximately equal to the time needed to train the network itself. Since the GAP8 System-on-Chip (SoC) mounted onboard the Crazyflie has only 512 KB of L2 memory, reducing the model size was crucial to avoid the latency increase caused by accesses to the off-chip DRAM. The architectures obtained from the NAS, which belong to the cycles-versus-accuracy Pareto front, were carefully evaluated both on test-set images and in on-field experiments. Starting from two different seeds, FrontNet and MobileNet v1, we obtained up to a 5.6x size reduction, which in turn provided up to 50% faster inference. Performance improved by 5% with respect to FrontNet, evaluated through the Mean Absolute Error (MAE) between the predicted and ground-truth relative pose on the x, y, z, and phi variables.
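The MAE-based evaluation described above can be sketched as follows. This is a minimal illustration, not the thesis code: the function name `pose_mae`, the array layout (one row per sample, columns x, y, z, phi), and the angular wrapping of phi are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: per-variable Mean Absolute Error (MAE) between
# predicted and ground-truth relative poses (x, y, z, phi).
def pose_mae(pred, gt):
    """pred, gt: arrays of shape (N, 4) with columns x, y, z, phi [rad]."""
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    err = np.abs(pred - gt)
    # phi is an angle: wrap the difference into [-pi, pi] before |.|,
    # so that e.g. +3.14 and -3.14 count as nearly identical headings.
    dphi = np.angle(np.exp(1j * (pred[:, 3] - gt[:, 3])))
    err[:, 3] = np.abs(dphi)
    return dict(zip(["x", "y", "z", "phi"], err.mean(axis=0)))
```

Reporting the error per variable, rather than as a single aggregate, makes it easy to single out the hardest output (here, phi) when comparing candidate networks.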
Training and testing images, with their corresponding labels, were acquired in the Manno (CH) laboratory and used both to select the various networks and for a first performance analysis. In-field tests, instead, were performed in a previously unseen environment, namely the Lugano (CH) laboratory. Our NAS was able to produce networks that perform up to 48% better than FrontNet in the new environment, in terms of MAE on the most challenging variable to predict (phi). In the on-field experiments, the control performance of the drone improved by 32% in terms of absolute distance and by 24% on the yaw control angle. Notably, the path walked for testing in the unseen environment was completed only by the architectures obtained by PIT starting from a MobileNet seed; the FrontNet seed and its derived models were unable to complete the path and, averaged over three experiments, covered at most 85% of it. In summary, this thesis demonstrates that efficient NAS techniques can be successfully employed to optimize deep learning models on constrained robotic platforms, reducing the size and complexity of networks while simultaneously improving their predictive performance.
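The selection of candidate architectures on the cycles-versus-accuracy Pareto front, mentioned in the abstract, amounts to keeping the non-dominated points over two objectives to be minimized (inference cycles and error). A minimal sketch, with illustrative candidate tuples and a hypothetical `pareto_front` helper not taken from the thesis code:

```python
# Hypothetical sketch: keep NAS candidates that no other candidate
# beats on both objectives (fewer cycles AND lower MAE).
def pareto_front(candidates):
    """candidates: list of (name, cycles, mae); lower is better for both."""
    front = []
    for name, cyc, mae in candidates:
        dominated = any(
            (c2 <= cyc and m2 <= mae) and (c2 < cyc or m2 < mae)
            for _, c2, m2 in candidates
        )
        if not dominated:
            front.append((name, cyc, mae))
    return front
```

Only the front survivors would then be worth deploying and testing on-field, since any dominated model has an alternative that is both faster and more accurate.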
Supervisors: Daniele Jahier Pagliari, Daniele Palossi, Christian Pilato
Academic year: 2022/23
Publication type: Electronic
Number of pages: 110
Subjects:
Degree programme: Master's degree programme in Computer Engineering (Ingegneria Informatica)
Degree class: New system > Master's degree > LM-32 - INGEGNERIA INFORMATICA
Joint supervision institution: Idsia SUPSI-USI (Switzerland)
Partner companies: SUPSI
URI: http://webthesis.biblio.polito.it/id/eprint/24546