Lorenzo Scarciglia
Visual Servoing for Autonomous Nano-drones Racing.
Advisors: Daniele Jahier Pagliari, Daniele Palossi, Alessio Burrello, Matteo Risso. Politecnico di Torino, Master's degree programme in Data Science and Engineering, 2023
PDF (Tesi_di_laurea) - Thesis, 13 MB
License: Creative Commons Attribution Non-commercial No Derivatives
Abstract:

Drones are nowadays used in many scenarios, from air surveillance and search-and-rescue missions to agriculture and cinematography. Drone miniaturisation has led to lightweight nano-drones that fit in the palm of a hand. Despite downsides such as a battery life on the order of minutes and the limited computational power of an onboard microcontroller unit (MCU) within a sub-100 mW power envelope, such systems can navigate narrow spaces and, being harmless around humans, can be used in many scenarios. Over the last decade, autonomous drone racing (ADR) competitions have fostered research and given scientists an opportunity to create cutting-edge perception and control algorithms that operate directly onboard the drones. In these competitions, drones must pass through a predefined set of gates as fast as possible while avoiding obstacles, without human intervention. Recently, nano-drones have entered ADR competitions as well.

In this thesis, we focus on the Crazyflie 2.1, a nano-drone with a diameter of only 10 cm and a weight of just 27 g, in a drone racing scenario featuring square gates. The drone is equipped with an ultra-low-power monochrome camera, sensors to estimate the drone's state, and the GAP8 MCU, which enables the execution of deep learning workloads directly onboard. To solve gate-based navigation, i.e., the task of identifying and crossing the gate, we employ image-based visual servoing (IBVS), a vision-based control scheme that aligns extracted image features with predefined targets by issuing velocity commands to the drone. We develop and compare two detection modules that extract the features required by IBVS, which in the gate-based navigation case are the four corners of the gate. One consists of traditional computer vision (CV) algorithms, while the other relies on deep learning (DL), exploiting a convolutional neural network. Webots, a robotics simulator, is used to gather the synthetic images needed to tune the CV module and to train the DL one. The synthetic training dataset is collected by randomly spawning the drone around the gate, taking care that each corner falls inside the image, and varying the background during collection. Two test datasets of increasing difficulty are collected in the same fashion, without background variation. Data augmentation is used to increase the number of training images and to mimic real camera images.

Overall, the detection modules are tested on the two synthetic test sets and on one set of real camera images. The DL module's error is between 1.8x and 4.3x lower than the CV module's. To jointly test a detection module and the IBVS controller, we use Webots and define a flight task consisting of take-off, gate-based navigation, and landing. The same task is carried out in three worlds of increasing difficulty: the background becomes harder while the drone and the relative position of the gate are the same in all worlds. While the CV module succeeds only in the simplest world, with a mean completion time of 84 seconds, the DL module outperforms it, completing that world in 64 seconds. Moreover, the DL module completes the task in all three worlds, demonstrating the generalisation ability of the trained network.
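The abstract describes a pipeline of two parts: a corner detector and an IBVS controller. The thesis does not spell out the exact CV pipeline in the abstract, so the following Python sketch shows only one conventional way to extract four gate corners from a monochrome frame (Otsu thresholding plus polygonal contour approximation); the function name and thresholds are illustrative assumptions, not the thesis implementation.

```python
import cv2
import numpy as np

def detect_gate_corners(gray):
    """Illustrative CV sketch: return the 4 corner pixels of the largest
    quadrilateral contour in a monochrome frame, or None if none is found."""
    # Otsu thresholding; BINARY_INV assumes a dark gate on a lighter background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        # Approximate the contour; a gate seen roughly head-on yields 4 vertices.
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:
            return approx.reshape(4, 2).astype(np.float64)
    return None
```

The IBVS controller referenced in the abstract is the classical point-feature law: with feature vector s (the four corners), desired features s*, and the stacked interaction matrix L, the camera velocity command is v = -λ L⁺ (s - s*). Below is a minimal NumPy sketch of that law, assuming normalized image coordinates and a single constant depth estimate Z; the gain and depth handling are tuning assumptions here, not taken from the thesis.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction matrix of one normalized image point at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_twist(corners, desired, Z=1.0, gain=0.5):
    """Camera twist [vx, vy, vz, wx, wy, wz] = -gain * pinv(L) @ (s - s*)."""
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in corners])  # 8x6
    error = (np.asarray(corners) - np.asarray(desired)).ravel()       # s - s*
    return -gain * np.linalg.pinv(L) @ error
```

Feeding this twist to the drone's velocity controller drives the detected corners toward the desired configuration (e.g., a centred square), which is what makes the four gate corners a sufficient feature set for gate crossing.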
Advisors: Daniele Jahier Pagliari, Daniele Palossi, Alessio Burrello, Matteo Risso
Academic year: 2023/24
Publication type: Electronic
Number of pages: 97
Degree programme: Master's degree programme in Data Science and Engineering
Degree class: New system > Master's degree > LM-32 - Computer Engineering
Collaborating companies: IDSIA / SUPSI
URI: http://webthesis.biblio.polito.it/id/eprint/28449