Alessandro Marchei
Neuromorphic vision for autonomous flight and landing on a winged drone.
Supervisor: Marcello Chiaberge. Politecnico di Torino, Master's degree programme in Electronic Engineering (Ingegneria Elettronica), 2024
PDF (Tesi_di_laurea) - Thesis. License: Creative Commons Attribution Non-commercial No Derivatives. Download (9MB)
Abstract:
Fixed-wing drones, known for their superior energy efficiency and speed, are increasingly used in applications such as environmental monitoring and long-range missions. While they offer advantages over quadrotors in terms of range and endurance, they remain comparatively underexplored in research due to the challenges they present. Precision in tasks like landing and altitude estimation relies heavily on accurate motion sensing and control, typically achieved through optical flow (OF) estimation. However, at high speeds, conventional cameras struggle with issues such as motion blur and limited dynamic range, which can compromise the quality of OF data. To overcome these limitations, this work explores the use of neuromorphic cameras, which mimic biological vision by responding to brightness changes at the pixel level. These sensors provide high temporal resolution and dynamic range, making them well suited for high-speed applications. Nevertheless, the high velocities typical of fixed-wing drones generate an overwhelming number of events, placing significant demands on embedded platforms and complicating real-time processing. The core of this research evaluates various optical flow algorithms, including both model-based and learning-based approaches, to identify the best candidate for real-time processing in event-based vision. After a comprehensive review of state-of-the-art techniques, the sparse Lucas-Kanade method was chosen for its balance of accuracy and computational efficiency; it nevertheless required substantial optimization to meet the specific challenges posed by real-time event-based data processing. The thesis details the full workflow, from processing raw event streams to generating low-level actuator commands, with emphasis on key steps such as optical flow computation, derotation, adaptive slicing, and altitude estimation. These processes were optimized to handle the high event rates produced by neuromorphic cameras, ensuring that the algorithm could operate efficiently on resource-constrained systems. The final implementation was tested outdoors under a variety of conditions, including full daylight, low-light scenarios, and varying event time resolutions. The system demonstrated remarkable robustness in estimating altitude and executing precise landing maneuvers, even in challenging conditions. The outcomes of this research suggest that the algorithm is highly generalizable and could be adapted to other drone tasks, such as obstacle avoidance and visual-inertial odometry, even on hardware with limited processing power, advancing the state of the art in neuromorphic vision and robotics.
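The pipeline summarized in the abstract (event slicing, sparse Lucas-Kanade optical flow, gyro-based derotation, altitude estimation) can be illustrated with a minimal Python sketch. This is not the thesis implementation: it uses OpenCV's pyramidal Lucas-Kanade tracker as a stand-in for the optimized sparse LK pipeline, a first-order pinhole model for rotation-induced flow, and a simplified altitude model (roughly nadir-pointing camera, known ground speed). All helper names and numeric parameters (`events_to_frame`, `f_px`, `ground_speed`, the 5 ms slice, the DAVIS346-like resolution) are illustrative assumptions, not values taken from the thesis.

```python
# Illustrative sketch only -- not the thesis code. Event slices are accumulated
# into frames, sparse corners are tracked with pyramidal Lucas-Kanade,
# rotation-induced flow is subtracted using gyro rates, and altitude is
# estimated from the remaining (translational) ventral flow.
import numpy as np
import cv2


def events_to_frame(events, shape):
    """Accumulate one slice of (t, x, y, polarity) events into an 8-bit image."""
    frame = np.zeros(shape, dtype=np.float32)
    for _, x, y, _ in events:
        frame[int(y), int(x)] += 1.0
    return np.clip(frame * 64.0, 0, 255).astype(np.uint8)


def sparse_lk(prev_img, next_img, max_corners=200):
    """Track Shi-Tomasi corners between two event slices with pyramidal LK."""
    p0 = cv2.goodFeaturesToTrack(prev_img, max_corners, qualityLevel=0.01,
                                 minDistance=7)
    if p0 is None:
        return np.empty((0, 2)), np.empty((0, 2))
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_img, next_img, p0, None,
                                             winSize=(21, 21), maxLevel=2)
    ok = status.ravel() == 1
    return p0.reshape(-1, 2)[ok], p1.reshape(-1, 2)[ok]


def rotational_flow(pts, gyro, f_px, cx, cy):
    """Flow (px/s) induced purely by body rotation under a pinhole model.
    Signs depend on the chosen camera/body frame conventions."""
    wx, wy, wz = gyro                                  # rad/s
    x, y = pts[:, 0] - cx, pts[:, 1] - cy
    u = (x * y / f_px) * wx - (f_px + x**2 / f_px) * wy + y * wz
    v = (f_px + y**2 / f_px) * wx - (x * y / f_px) * wy - x * wz
    return np.stack([u, v], axis=1)


def altitude_from_ventral_flow(p0, p1, dt, gyro, f_px, cx, cy, ground_speed):
    """h ~= f * v / ventral_flow for a roughly nadir-pointing camera."""
    if len(p0) == 0:
        return None
    flow = (p1 - p0) / dt                              # measured flow, px/s
    flow -= rotational_flow(p0, gyro, f_px, cx, cy)    # derotation
    ventral = np.median(np.linalg.norm(flow, axis=1))  # robust translational flow
    return f_px * ground_speed / ventral if ventral > 1e-6 else None


if __name__ == "__main__":
    # Synthetic demo: two random-dot event slices 5 ms apart, shifted by 2 px.
    # With f_px = 230 and ground_speed = 8 m/s the true flow is 400 px/s, so
    # the altitude estimate should come out near 230 * 8 / 400 = 4.6 m.
    rng = np.random.default_rng(0)
    h, w = 260, 346                                    # DAVIS346-like resolution
    ev0 = [(0.0, int(rng.integers(w)), int(rng.integers(h)), 1)
           for _ in range(5000)]
    ev1 = [(0.005, (x + 2) % w, y, p) for _, x, y, p in ev0]
    f0, f1 = events_to_frame(ev0, (h, w)), events_to_frame(ev1, (h, w))
    p0, p1 = sparse_lk(f0, f1)
    alt = altitude_from_ventral_flow(p0, p1, dt=0.005, gyro=(0.0, 0.0, 0.0),
                                     f_px=230.0, cx=w / 2, cy=h / 2,
                                     ground_speed=8.0)
    print(f"tracked {len(p0)} corners, altitude estimate: {alt}")
```

Taking the median of the derotated flow magnitudes is one simple way to make the ventral-flow estimate robust to outlier tracks; the actual thesis pipeline may filter or weight the flow field differently.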
| Field | Value |
|---|---|
| Supervisor | Marcello Chiaberge |
| Academic year | 2024/25 |
| Publication type | Electronic |
| Number of pages | 124 |
| Subjects | |
| Degree programme | Master's degree programme in Electronic Engineering (Ingegneria Elettronica) |
| Degree class | New system > Master's degree > LM-29 - Electronic Engineering |
| Institution in co-tutelle | EPFL (Switzerland) |
| Collaborating companies | NOT SPECIFIED |
| URI | http://webthesis.biblio.polito.it/id/eprint/33163 |