
Progettazione e validazione di una metodologia di confronto per metodi di Explainable AI = Design and validation of a comparison methodology for Explainable AI techniques

Francesca Vanni


Advisors: Tania Cerquitelli, Salvatore Greco. Politecnico di Torino, Master's degree programme in Ingegneria Gestionale, 2022

PDF (Tesi_di_laurea) - Thesis, 13 MB
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract:

Nowadays, Artificial Intelligence (AI) is everywhere, and people have grown accustomed to AI making decisions in their daily lives: product recommendations on Amazon, film suggestions on Netflix, friend suggestions on Facebook or Instagram, or advertisements that Google tailors to whoever is browsing. However, for decisions that can truly make a difference, such as diagnosing a disease, it is important to know the motivation behind such a high-stakes decision. Explainable Artificial Intelligence (XAI) systems are a potential path towards accountable AI, making it trustworthy by explaining decision processes and AI logic to end users. In particular, explaining an algorithm's behaviour makes it possible to check it in the event of unintended or undesirable outcomes, e.g. cases of social or racial discrimination. This thesis first provides a general state-of-the-art overview of the subject, discussing what Artificial Intelligence is, why explanations matter, and how they are useful in today's world. It then classifies the main characteristics of explanation techniques and describes the most common ones, outlining in general terms how each works and giving an example of how it has been validated. Finally, it offers a global overview of the surveys in the literature that compare explanation methods, and proposes a general, comprehensive methodology for comparing the different explanation techniques, together with its testing. This last part focuses on explanation techniques that support textual data and includes both objective and human-based metrics and criteria.

Advisors: Tania Cerquitelli, Salvatore Greco
Academic year: 2021/22
Publication type: Electronic
Number of pages: 132
Subjects:
Degree programme: Master's degree programme in Ingegneria Gestionale (Management Engineering)
Degree class: Nuovo ordinamento > Laurea magistrale > LM-31 - INGEGNERIA GESTIONALE
Partner companies: NOT SPECIFIED
URI: http://webthesis.biblio.polito.it/id/eprint/22511