Benedetta Sabbadini
Capturing Gait Signature: A Preliminary Study on the Feasibility of Recognizing People by their Gait Based on a Biomechanically-driven Marker-less Approach Using Multiple RGB Cameras.
Supervisors: Andrea Cereatti, Diletta Balta, Paolo Tasca. Politecnico di Torino, Master's Degree in Biomedical Engineering, 2024
PDF (Tesi_di_laurea): Thesis, 3MB
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract:
Gait recognition based on video data has gained popularity in both surveillance and clinical contexts due to its unique ability to identify individuals by their walking patterns. Unlike other biometrics such as fingerprints or facial recognition, gait can be captured at a distance, does not require user cooperation, tolerates low-quality video, remains reliable when other body traits are concealed, and is hard to imitate. Gait recognition uses either model-free or model-based methods: model-free approaches rely on image-based features, while model-based methods focus on biomechanical features such as joint kinematics. Model-based methods are more interpretable, view- and scale-invariant, and less affected by background noise, but they carry greater computational complexity, so identifying the most relevant gait features is essential to reduce this load. This work proposes a model-based, marker-less method for gait recognition driven by biomechanical features. To reduce computational complexity, features were selected automatically from a wide range of parameters drawn from the literature, covering various domains and modalities. The study involved fifteen healthy subjects walking at three different speeds along a 5 m indoor walkway, recorded by three RGB cameras. Joint centres were tracked semi-automatically using MoveNet, a deep-learning pose estimator. Strides were segmented automatically to extract gait features stride by stride, including both time-domain and frequency-domain features. Features were ranked using the minimum redundancy maximum relevance (mRMR) algorithm, and those scoring below 0.001 were excluded, leaving 19 features comprising kinematic and frequency-domain parameters and correlation indices. Top-down wrapper feature selection was then applied using seven classification models: (i) Decision Tree, (ii) Discriminant Analysis, (iii) Ensemble Classifier, (iv) ECOC, (v) k-NN, (vi) Naïve Bayes, and (vii) Neural Networks.
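The greedy mRMR ranking described above can be sketched in a few lines. This is a minimal stdlib illustration, not the thesis's implementation: it uses absolute Pearson correlation as a stand-in for both relevance (feature vs. subject label) and redundancy (feature vs. already-selected features), whereas canonical mRMR typically uses mutual information. All feature names below are hypothetical examples.

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient; 0.0 if either series is constant."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def mrmr_rank(features, labels):
    """Greedy mRMR sketch: at each step pick the remaining feature that
    maximizes relevance (|corr| with the label) minus mean redundancy
    (mean |corr| with features already selected)."""
    remaining = list(features)
    selected, scores = [], {}
    while remaining:
        best_name, best_score = None, None
        for name in remaining:
            relevance = abs(pearson(features[name], labels))
            redundancy = (mean(abs(pearson(features[name], features[s]))
                               for s in selected) if selected else 0.0)
            score = relevance - redundancy
            if best_score is None or score > best_score:
                best_name, best_score = name, score
        selected.append(best_name)
        scores[best_name] = best_score
        remaining.remove(best_name)
    return selected, scores
```

A redundant copy of an already-selected feature scores near zero at later steps, which is the behaviour that lets a score threshold (0.001 in the thesis) prune the feature pool.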
Classification accuracy was assessed via 5-fold cross-validation, and the best feature set was chosen for each model. Hyperparameters were tuned by grid search with 5-fold cross-validation, and the best-performing configuration was selected for each model. Finally, models were evaluated on unseen data, with recall, precision, and F1-score calculated for the entire dataset and for each walking speed. The ECOC model achieved the highest F1-score (78.71%), followed by k-NN (73.40%), while Decision Tree performed poorly (39.86%) due to overfitting, sensitivity to small changes, and reliance on single-feature splits, which limit its ability to capture complex gait patterns. The remaining models' F1-scores were 69.58% (Discriminant Analysis), 64.15% (Ensemble Classifier), 62.08% (Naïve Bayes), and 66.19% (Neural Networks). Although this method underperformed previous studies, likely due to dataset homogeneity, the identified feature subsets provide a strong basis for discriminating individuals from their gait, and performance could improve with a larger, more diverse dataset and more varied gait patterns. This study contributes to the development of biomechanically driven gait recognition with potential applications in clinical assessment and video surveillance.
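The evaluation protocol above (k-fold splitting plus per-class precision, recall, and F1) can be sketched with the standard library alone. This is a generic illustration of the metrics named in the abstract, not the thesis's code, and the fold scheme here (simple index striding, no shuffling or per-subject stratification) is an assumption.

```python
from collections import Counter

def precision_recall_f1(y_true, y_pred):
    """Per-class (precision, recall, F1) from paired label lists."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted class p, but it was wrong
            fn[t] += 1  # true class t was missed
    metrics = {}
    for c in set(y_true) | set(y_pred):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        metrics[c] = (prec, rec, f1)
    return metrics

def kfold_indices(n, k=5):
    """Yield (train, test) index lists for k-fold cross-validation;
    the k test folds partition range(n)."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

Averaging the per-class F1 values over all subjects and folds gives a single score of the kind reported per model in the abstract.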
Supervisors: Andrea Cereatti, Diletta Balta, Paolo Tasca
Academic year: 2024/25
Publication type: Electronic
Number of pages: 111
Degree course: Master's Degree in Biomedical Engineering (Corso di laurea magistrale in Ingegneria Biomedica)
Degree class: New system > Master's degree > LM-21 - BIOMEDICAL ENGINEERING
Collaborating institutions: Politecnico di Torino
URI: http://webthesis.biblio.polito.it/id/eprint/33358