
Luigi Maggipinto
Humanoid Robots for Visual Distraction In-Vehicle Test Automation.
Rel. Riccardo Coppola. Politecnico di Torino, Corso di laurea magistrale in Ingegneria Informatica (Computer Engineering), 2025
PDF (Tesi_di_laurea) - Tesi (56 MB)
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract:

Road traffic safety has become a critical area of research due to the increasing number of accidents caused by driver distraction. To address this issue, the European Union has introduced Regulation (EU) 2019/2144, which requires the integration of Advanced Driver Distraction Warning Systems (ADDWS) into all newly manufactured vehicles by 2026. ADDWS operate within Driver Monitoring Systems (DMS) to detect driver distraction using visual and sensor-based monitoring techniques. Ensuring the effectiveness of these systems requires rigorous validation under controlled conditions. This research proposes an automated framework for validating the visual distraction detection of ADDWS using a humanoid robotic platform, Ameca Desktop, as a synthetic driver. Unlike traditional human-subject testing, this approach eliminates variability, providing a reproducible ground truth for distraction detection. The system consists of a multi-node architecture in which data is collected from Ameca's motor positions, an Intel RealSense camera, and a Time-of-Flight (ToF) sensor. Raspberry Pi units coordinate data acquisition through a Flask-based API, ensuring synchronised recording of motor angles, depth data, and video frames. A fundamental component of the methodology is the analysis of Ameca's head and eye movements to classify distraction states. The collected data is processed to determine whether the humanoid exceeds predefined motion thresholds, categorising its state as either distracted or not distracted. To validate the robustness of the system, a System Under Test (SUT) approach is introduced, employing MediaPipe to estimate head pose (yaw, pitch, roll) and eye gaze direction under identical test conditions. This enables a comparative evaluation between the robotic ground truth and computer vision-based distraction detection.

The proposed framework establishes a scalable, automated testing solution for ADDWS evaluation by leveraging Ameca's precise motor control and high-fidelity facial expressions. Furthermore, the system is designed to ensure that distraction detection methodologies comply not only with Regulation (EU) 2019/2144 but also with the performance assessment criteria established by Euro NCAP. By aligning with these regulatory and safety standards, this research contributes to the advancement of reliable DMS validation methodologies, ensuring the robustness of ADDWS before their deployment in consumer vehicles.
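The abstract describes classifying the humanoid's state as distracted or not distracted by checking head-pose angles against predefined motion thresholds. A minimal sketch of that idea is shown below; the threshold values, field names, and function names are illustrative assumptions, not the thesis's actual parameters or code.

```python
# Illustrative threshold-based distraction classifier (assumption: the
# thesis's actual thresholds and data model are not reproduced here).
from dataclasses import dataclass


@dataclass
class HeadPose:
    """Head-pose angles in degrees, e.g. as estimated by MediaPipe."""
    yaw: float    # left/right rotation
    pitch: float  # up/down rotation
    roll: float   # sideways tilt


# Hypothetical limits: exceeding either angle flags visual distraction.
YAW_LIMIT_DEG = 30.0
PITCH_LIMIT_DEG = 20.0


def classify(pose: HeadPose) -> str:
    """Return 'distracted' if yaw or pitch exceeds its threshold."""
    if abs(pose.yaw) > YAW_LIMIT_DEG or abs(pose.pitch) > PITCH_LIMIT_DEG:
        return "distracted"
    return "not distracted"
```

Comparing these labels, computed from Ameca's commanded motor angles, against the labels produced by the vision-based SUT under identical conditions is what enables the ground-truth evaluation the abstract describes.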
Supervisors: Riccardo Coppola
Academic year: 2024/25
Publication type: Electronic
Number of pages: 93
Degree course: Corso di laurea magistrale in Ingegneria Informatica (Computer Engineering)
Degree class: Nuovo ordinamento > Laurea magistrale > LM-32 - INGEGNERIA INFORMATICA
Partner companies: SANTER Reply S.p.a.
URI: http://webthesis.biblio.polito.it/id/eprint/35341