Samuele Battaglino
Generalized Principal Component Analysis Theory.
Advisors: Elena Maria Baralis, Erdem Koyuncu. Politecnico di Torino, Master's degree programme in Ingegneria Informatica (Computer Engineering), 2019
PDF (Tesi_di_laurea) - Thesis (2MB)
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract:
In the broad field of Principal Component Analysis (PCA), a long-established technique, recent advances have left considerable room for improvement. We propose a way to fully generalize standard PCA and its kernel version (KPCA) beyond the L2 norm in their objective function: a generic function can be used instead, to better fit different datasets and applications and ultimately to outperform the usual norm-based PCA. In the literature, several attempts with Lp norms have produced good results, so turning to other functions is the logical next step. We provide the mathematical models behind our working algorithms, together with test results for different functions on a wide variety of datasets. In most cases our method proved more robust to both outliers and noise than the state-of-the-art L1-norm PCA and L1-norm KPCA, and more flexible than any previous adaptation. Lastly, we prove local optimality for most of our algorithms, and we draw a comparison between KPCA and a Recurrent Neural Network (RNN).
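To make the idea in the abstract concrete, the sketch below illustrates, under our own assumptions rather than as the thesis's actual algorithm, how the squared term in the standard PCA objective can be replaced by a generic function f and optimized over unit-norm directions; the function names and the projected-gradient scheme are hypothetical.

```python
# Minimal sketch: standard PCA maximizes sum_i (w^T x_i)^2 over unit vectors w.
# Replacing the squared term with a generic f covers L1-PCA (f = |.|) and
# Lp-norm PCA as special cases. Names and the optimization scheme are assumptions.
import numpy as np

def generalized_pca_component(X, f_prime, n_iter=500, lr=1e-2, seed=0):
    """Find one direction w maximizing sum_i f(w^T x_i) with ||w|| = 1,
    by projected gradient ascent; f_prime is the derivative of the chosen f."""
    rng = np.random.default_rng(seed)
    _, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        scores = X @ w                 # projections w^T x_i
        grad = X.T @ f_prime(scores)   # gradient of sum_i f(w^T x_i)
        w = w + lr * grad              # ascent step
        w /= np.linalg.norm(w)         # project back onto the unit sphere
    return w

# Example: f(t) = |t|^1.5, an Lp-style objective between L1-PCA and standard PCA.
if __name__ == "__main__":
    X = np.random.default_rng(1).standard_normal((200, 5))
    f_prime = lambda t: 1.5 * np.sign(t) * np.abs(t) ** 0.5
    print("leading generalized component:", generalized_pca_component(X, f_prime))
```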
Advisors: Elena Maria Baralis, Erdem Koyuncu
Academic year: 2019/20
Type of publication: Electronic
Number of pages: 61
Subjects:
Degree programme: Master's degree programme in Ingegneria Informatica (Computer Engineering)
Degree class: New system > Master's degree > LM-32 - INGEGNERIA INFORMATICA
Joint supervision institution: UNIVERSITY OF ILLINOIS AT CHICAGO (UNITED STATES OF AMERICA)
Collaborating companies: NOT SPECIFIED
URI: http://webthesis.biblio.polito.it/id/eprint/13126