Giulia D'Ascenzi
Transformer-based pre-trained models for Out-Of-Distribution detection.
Supervisors: Tatiana Tommasi, Francesco Cappio Borlino. Politecnico di Torino, Master's degree in Ingegneria Informatica (Computer Engineering), 2023
Abstract
Standard closed-set deep learning approaches deployed in the real world fail when test samples come from never-seen data distributions, which makes them untrustworthy for safety-critical applications such as autonomous driving or healthcare. A preferable behavior would be to raise an alert when an unknown object category appears, a task known as Out-Of-Distribution (OOD) detection. The common strategy for this task is to train (or at least fine-tune) the detector on the target data so that it learns the normality distribution and can then recognize test samples that do not belong to it. Consequently, it is necessary to gather a large amount of labeled target data and to repeat the training procedure for each distinct downstream task.
These requirements are restrictive for many real-world applications because of data privacy rules and strict memory and computational constraints (e.g., edge computing).
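The train-then-threshold strategy described above can be illustrated with a toy nearest-prototype detector: fit the "normality" from labeled in-distribution features, then flag test samples whose distance to every known class exceeds a threshold. This is a minimal sketch for intuition only, not the method studied in the thesis; the function names, the synthetic features, and the distance-based score are all assumptions.

```python
import numpy as np

def fit_prototypes(features, labels):
    """Estimate one prototype (mean feature vector) per known class."""
    classes = np.unique(labels)
    return np.stack([features[labels == c].mean(axis=0) for c in classes])

def ood_score(features, prototypes):
    """Distance to the nearest class prototype: higher means more likely OOD."""
    dists = np.linalg.norm(features[:, None, :] - prototypes[None, :, :], axis=-1)
    return dists.min(axis=1)

rng = np.random.default_rng(0)

# Synthetic in-distribution data: two well-separated clusters (the "normality").
id_feats = np.concatenate([rng.normal(0.0, 0.5, (50, 8)),
                           rng.normal(5.0, 0.5, (50, 8))])
id_labels = np.array([0] * 50 + [1] * 50)
protos = fit_prototypes(id_feats, id_labels)

# Threshold chosen from the in-distribution scores (a naive choice).
threshold = ood_score(id_feats, protos).max()

# Samples far from every prototype score above the threshold and are flagged.
ood_feats = rng.normal(20.0, 0.5, (10, 8))
flags = ood_score(ood_feats, protos) > threshold
```

Note that this sketch still needs labeled target data to build the prototypes and the threshold, which is exactly the requirement the abstract identifies as problematic.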
