Livia Vico
Detecting human gestures through event camera data using deep neural networks.
Advisors: Laura Gastaldi, Michele Polito, Pedro Neto, Laura Sofia Ferreira Duarte. Politecnico di Torino, Master's degree programme in Biomedical Engineering, 2024
Abstract
Human action recognition has diverse applications, particularly in environments where humans and robots interact and collaborate. In any human-robot collaboration (HRC) scenario, a robot's ability to understand human actions and intentions within the environment is crucial for both operational effectiveness and the safety of all agents. Robot situational awareness leverages various innovative technologies. Event cameras are now a key element of computer vision and represent a promising technology for enabling robots to recognize human actions. These sensors capture movement by detecting changes in pixel brightness asynchronously, providing a higher temporal resolution than traditional cameras and minimizing data redundancy. The event data output can be converted into a frame-like representation, allowing the application of proven frame-based action recognition algorithms. Among these, deep neural networks stand out, leveraging their multi-layer learning capability to identify complex patterns in data.
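The event-to-frame conversion mentioned above can be sketched as follows. This is an illustrative example only: the (t, x, y, polarity) event format and the two-channel accumulation scheme are common conventions in the event-camera literature, not necessarily the specific representation used in this thesis.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate a batch of events into a 2-channel frame.

    `events` is an (N, 4) array of (t, x, y, polarity) rows,
    with polarity +1 (brightness increase) or -1 (decrease).
    Channel 0 counts positive (ON) events, channel 1 negative
    (OFF) events, yielding a frame-like tensor that standard
    frame-based networks can consume.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)
    for _, x, y, p in events:
        channel = 0 if p > 0 else 1
        frame[channel, int(y), int(x)] += 1.0
    return frame

# Toy stream: three events at two pixels within one time window.
stream = np.array([
    [0.001, 3, 2, +1],
    [0.002, 3, 2, +1],
    [0.003, 5, 4, -1],
])
frame = events_to_frame(stream, height=8, width=8)
print(frame[0, 2, 3])  # 2.0 — two ON events at pixel (x=3, y=2)
print(frame[1, 4, 5])  # 1.0 — one OFF event at pixel (x=5, y=4)
```

In practice the stream would be sliced into fixed-duration windows, producing one such frame per window; the resulting frame sequence is then fed to the recognition network.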
