
Evaluating AutoML-Driven Estimators for State Estimation and the Application of Explainable AI

Alessio Carachino


Supervisors: Edoardo Patti, Alessandro Aliberti. Politecnico di Torino, Master's degree programme in Ingegneria Informatica (Computer Engineering), 2024

Abstract:

State estimation is a fundamental task in the operation of smart grids, ensuring that grid conditions can be continuously monitored and forecast for effective management. Although conventional state estimation techniques are efficient, they struggle to adapt to increasingly complex grid environments. The goals of this Thesis include assessing whether AutoML tools can outperform traditional methods, paving the way for the future development of automated state estimation solutions in smart grids. To assess the efficacy of AutoML models in this task, synthetic data were generated with the pandapower tool on two grids of different sizes to replicate diverse operating scenarios. The choice of AutoML frameworks was restricted to open-source options, leading to AutoGluon, H2O, and AutoSklearn. These frameworks automate common stages of the machine learning pipeline, including the selection, tuning, and validation of machine learning models. They were evaluated against conventional estimation methods, such as those based on a multilayer perceptron, to determine their comparative effectiveness in terms of factors such as training time and estimation quality. The estimators implemented with AutoML consistently outperformed conventional approaches across various grid conditions, demonstrating the feasibility of AutoML for improving the adaptability and resilience of grid monitoring systems and encouraging the future development of an AutoML-driven estimation solution to be integrated into power systems. In addition, this Thesis conducted a preliminary investigation into the role of explainable AI (XAI) in state estimation.
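As context for the comparison above: the conventional baseline in state estimation is typically a weighted least-squares (WLS) solve over redundant, noisy measurements. The sketch below is purely illustrative and not taken from the Thesis; the measurement matrix `H`, true state `x_true`, and noise levels `sigma` are hypothetical, and a linearized (DC-style) model is assumed for simplicity.

```python
import numpy as np

def wls_estimate(H, z, sigma):
    """Estimate the state x from measurements z = H @ x + noise
    via weighted least squares, weighting by inverse variance."""
    W = np.diag(1.0 / sigma**2)        # weight = 1 / measurement variance
    G = H.T @ W @ H                    # gain matrix
    return np.linalg.solve(G, H.T @ W @ z)

rng = np.random.default_rng(0)
x_true = np.array([0.0, -0.05, -0.08])   # e.g. bus voltage angles (rad)
H = np.array([[1.0,  0.0,  0.0],
              [1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0],
              [1.0,  0.0, -1.0]])        # angle- and flow-style measurements
sigma = np.array([0.01, 0.02, 0.02, 0.02])
z = H @ x_true + rng.normal(0.0, sigma)  # noisy synthetic measurements
x_hat = wls_estimate(H, z, sigma)
print(np.round(x_hat, 3))
```

With four measurements for three state variables, the redundancy lets the estimator average out noise; ML-based estimators learn a comparable measurement-to-state mapping directly from data.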
Three XAI tools, SHAP, LIME, and LRP, were described and tested to determine the kinds of explanations they can provide for both traditional and AutoML-based estimators, focusing on the factors the tools rely on to generate explanations and on how model predictions could be interpreted in the context of smart grid operations. While the AutoML models achieved better performance, the applicability of the XAI tools remains less conclusive: despite their potential to enhance transparency and trust in the employed models, further research is needed to confirm their utility in operational settings. This research demonstrates that AutoML-based estimators are a promising alternative to traditional state estimation approaches, offering improved accuracy and automated model development. Although the integration of XAI shows potential for improving model interpretability, its practical implementation for state estimation in smart grids requires further exploration. Finally, this study lays the groundwork for future research on combining AutoML and XAI to create smarter, more interpretable state estimation solutions for the evolving needs of grid management.
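The XAI tools named above all attribute a model's prediction to its input features. As a minimal, model-agnostic illustration of that idea (not the Thesis's actual experiments), the sketch below ranks features by permutation importance: shuffle one input column and measure how much the estimation error grows. The linear "model", the data, and the weights are hypothetical stand-ins for a trained estimator and grid measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))             # e.g. injections at three buses
w_true = np.array([2.0, 0.5, 0.0])        # feature 0 matters most, feature 2 not at all
y = X @ w_true + rng.normal(0.0, 0.1, 500)  # e.g. a bus voltage magnitude

model = lambda X: X @ w_true              # stand-in for a trained estimator

def permutation_importance(model, X, y):
    """Increase in mean squared error when each feature is shuffled."""
    base = np.mean((model(X) - y) ** 2)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])   # break feature j's link to y
        scores.append(np.mean((model(Xp) - y) ** 2) - base)
    return np.array(scores)

imp = permutation_importance(model, X, y)
print(np.argsort(imp)[::-1])              # features ranked by importance
```

SHAP and LIME produce richer, per-prediction attributions (and LRP propagates relevance through network layers), but the underlying question is the same: which inputs does the estimator actually depend on.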

Supervisors: Edoardo Patti, Alessandro Aliberti
Academic year: 2024/25
Publication type: Electronic
Number of pages: 92
Additional information: Confidential thesis. Full text not available
Subjects:
Degree programme: Corso di laurea magistrale in Ingegneria Informatica (Computer Engineering)
Degree class: New regulations > Master's degree > LM-32 - INGEGNERIA INFORMATICA
Collaborating companies: NOT SPECIFIED
URI: http://webthesis.biblio.polito.it/id/eprint/33145