Giulia Botta
Social Robot as a Support for Sign Language Communication.
Supervisors: Pangcheng David Cen Cheng, Cristina Gena. Politecnico di Torino, Master's degree programme in Computer Engineering (Ingegneria Informatica), 2024
PDF (Tesi_di_laurea): Thesis. License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract:
Robots have progressively become more pervasive in our everyday lives: they help us with activities such as cleaning, working, and entertainment. They can also help us communicate across languages, through automatic translation, text-to-speech, and similar technologies. The focus of this thesis is the study of a social robot as an interpreter of sign language, specifically Italian Sign Language (LIS). The robot considered in this work is Pepper, developed by Aldebaran. Pepper is a social robot, equipped with sensors that prevent dangerous movements. It is also a humanoid robot, so it resembles a human in appearance. Pepper is similar to us but has some limitations: for example, it cannot move its fingers independently, and the tablet built into its chest restricts its arm movements. Despite these limitations, Pepper is well suited to this research, since it is a commercially available robot and studies have shown that children respond well to it. The thesis is divided into two main steps. The first step is to implement the signs through the Animation Editor, a tool of the Pepper SDK for Android Studio, setting the joint values for each movement of each sign. The second step consists of writing a script to automate the creation of the signs; the objective is to create a new sign from 3D coordinates without manually changing the joint values. The script is composed of two parts: a MATLAB script that computes the joint angles for each movement from the given coordinates, and a Python script that acts as a wrapper, calling the MATLAB script and writing the file containing the details of the sign. The resulting animation file is in an XML format that can be opened in the Animation Editor. The signs to be implemented were chosen by narrowing down the configurations to those physically realizable by the robot; these signs were identified with the help of a human sign language interpreter.
Some of these signs are related to education; the others are verbs and nouns on various topics. A total of 52 signs were implemented. Although this number would surely be higher with a robot free of the limitations mentioned above, it allowed us to draw some conclusions on the possibility of using such robots to interact with hearing-impaired people. A survey containing a set of signs was administered to deaf participants; the results showed that the signs, although not as smooth as a human signer's, are understandable, and that Pepper can be used to communicate simple concepts in sign language.
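The two-step pipeline described in the abstract (compute joint angles from 3D coordinates, then write an XML animation file) can be sketched roughly as follows. This is an illustrative sketch only: the planar two-link `two_link_ik` helper, the joint names, and the XML element names are assumptions made for this example, not the thesis's actual MATLAB code or the Animation Editor's real file schema.

```python
import math
import xml.etree.ElementTree as ET


def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics via the law of cosines:
    returns (shoulder, elbow) angles that place the hand at (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, c2)))  # clamp for numeric safety
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow


def write_animation_xml(path, targets, l1=0.18, l2=0.15, fps=25):
    """Convert a sequence of hand targets into joint angles and write a
    minimal, hypothetical XML animation file (one <Keyframe> per pose)."""
    root = ET.Element("Animation", fps=str(fps))
    for i, (x, y) in enumerate(targets):
        shoulder, elbow = two_link_ik(x, y, l1, l2)
        kf = ET.SubElement(root, "Keyframe", index=str(i))
        ET.SubElement(kf, "Joint", name="RShoulderPitch",
                      value=f"{shoulder:.4f}")
        ET.SubElement(kf, "Joint", name="RElbowRoll",
                      value=f"{elbow:.4f}")
    ET.ElementTree(root).write(path, encoding="unicode")


# Two hand positions (metres) forming one simple sign movement.
write_animation_xml("sign.xml", [(0.25, 0.10), (0.10, 0.28)])
```

In the thesis the angle computation is done in MATLAB for Pepper's full arm rather than a two-link planar model, but the division of labour is the same: a kinematics routine produces joint values, and a Python wrapper serializes them into a file the Animation Editor can open.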
Supervisors: | Pangcheng David Cen Cheng, Cristina Gena |
Academic year: | 2024/25 |
Publication type: | Electronic |
Number of pages: | 45 |
Subjects: | |
Degree programme: | Master's degree programme in Computer Engineering (Ingegneria Informatica) |
Degree class: | New system > Master's degree > LM-32 - Computer Engineering |
Collaborating institutions: | UNIVERSITA' DEGLI STUDI DI TORINO |
URI: | http://webthesis.biblio.polito.it/id/eprint/33969 |