Politecnico di Torino (logo)

Enhancing Structural Health Monitoring through Self-supervised learning: An application of Masked Autoencoders on Anomaly Detection and Traffic Load Estimation.

Yhorman Alexander Bedoya Velez

Rel. Daniele Jahier Pagliari. Politecnico di Torino, Corso di laurea magistrale in Data Science and Engineering, 2023

PDF (Tesi_di_laurea) - Tesi
Restricted to: Repository staff only until 21 April 2024 (embargo date).
License: Creative Commons Attribution Non-commercial No Derivatives.

Structural Health Monitoring (SHM) is an increasingly important field due to the rising number and complexity of structures such as bridges, buildings, and viaducts. The need to improve user safety and ensure the optimal functionality of structures has motivated the evolution of SHM systems from sporadic human inspections to continuous monitoring through various types of sensors capable of streaming high-frequency data. The increasing amount of SHM data has encouraged the development of algorithms and methodologies that support structural monitoring and decision-making processes in order to extend a structure's lifespan, reduce maintenance costs, and ensure user safety. Anomaly detection and Traffic Load Estimation (TLE) are two key activities in the field of SHM. Anomaly detection strategies identify when a structure is not behaving as expected, revealing damage at an early stage, optimizing maintenance activities, and increasing the reliability of structures. TLE estimates the usage level of a structure at a given time. An effective TLE pipeline helps to uncover patterns in the structure's load over time, which can be useful to identify possible overload situations or to improve maintenance schedules. The aim of this thesis is to study the use of self-supervised, Transformer-based Masked Autoencoders on data generated by SHM systems to address the tasks of anomaly detection and TLE. Additionally, we aim to demonstrate that fine-tuning a pre-trained model leads to better performance on supervised tasks than training supervised models from scratch. Our methodology is divided into two steps for each task.
First, a pre-training phase common to both applications transforms the signals generated by SHM accelerometers into PSD spectrograms. The masked autoencoder is then used in a self-supervised reconstruction task to learn latent representations and patterns from the spectrograms. In the second step, the pre-trained autoencoder is applied to the specific applications. For anomaly detection, we propose a methodology based on the reconstruction error to identify anomalies. For TLE, we propose a fine-tuning phase in which the pre-trained autoencoder is further trained on a smaller labeled dataset to predict the number of vehicles crossing the viaduct at a given time based on the signal disturbances. The results of our experiments demonstrate an accuracy of 98.8% on the anomaly detection task with a 1-minute delay in detecting anomalies. This result is comparable to state-of-the-art techniques based on the PCA algorithm trained on the same dataset, which achieve the same 98.8% accuracy but with a 1-hour delay in detecting anomalies. For TLE, we achieve a Mean Absolute Error (MAE) of 0.4 and a Mean Absolute Percentage Error (MAPE) of 26%. We also demonstrate that the pre-training phase improves the model's performance by 17%.

Relators: Daniele Jahier Pagliari
Academic year: 2022/23
Publication type: Electronic
Number of Pages: 81
Corso di laurea: Corso di laurea magistrale in Data Science and Engineering
Classe di laurea: New organization > Master of Science > LM-32 - COMPUTER SYSTEMS ENGINEERING
Collaborating companies: UNSPECIFIED
URI: http://webthesis.biblio.polito.it/id/eprint/26715