Journal article

Transformer-Based Deep Learning Strategies for Lithium-Ion Batteries SOX Estimation Using Regular and Inverted Embedding

Abstract

The accurate estimation of Li-ion battery (LIB) states such as State of Charge (SOC), State of Health (SOH), and State of Power (SOP) plays a pivotal role in the efficient operation of Electric Vehicles (EVs). These parameters affect the battery's health, driving range, and overall vehicle performance. Transformer-based artificial neural networks have shown impressive results in natural language processing (NLP) and in estimation problems across many other domains. This paper presents an intensive study of the capabilities of various Transformer-based models in estimating the SOC and SOH of LIBs; the SOP is obtained from the estimated SOC. The paper provides the following key original contributions: 1) the application of the Informer and Reformer variants of the Transformer model, for the first time, to SOH estimation of LIBs in EVs; 2) a study of the effect of the inverted embedding of iTransformers, a modified Transformer architecture, on SOC and SOH estimation, with the inversion also applied to the Informer and Reformer; 3) a simple feature extraction method using partial discharge cycles for SOH estimation with Transformer-based models; 4) a new robust method for SOC estimation based on a 2-Encoder-Transformer with a one-dimensional convolutional neural network (1D-CNN) architecture; 5) training, validation, and testing of the various architectures on two real-world datasets comprising various driving scenarios and battery conditions. Comparative analysis with various deep learning architectures shows impressive accuracy in estimating the SOC and SOH, leading to better SOP calculation.
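To illustrate the inverted-embedding idea the abstract refers to, the sketch below contrasts a regular embedding (each time step becomes a token) with an iTransformer-style inverted embedding (each variate's whole series becomes a token). This is a minimal, generic example and not the authors' implementation; the tensor shapes, layer sizes, and variate choices (e.g. voltage, current, temperature) are assumptions for demonstration only.

```python
# Minimal sketch (not the paper's code) contrasting regular and inverted embedding
# for a multivariate battery time series.
# Assumed shapes: batch B, sequence length T (time steps), variates V.
import torch
import torch.nn as nn

B, T, V, d_model = 32, 128, 3, 64
x = torch.randn(B, T, V)  # synthetic stand-in for measured battery signals

# Regular embedding: each time step (a V-dimensional reading) becomes a token,
# so self-attention mixes information across time.
regular_embed = nn.Linear(V, d_model)
tokens_time = regular_embed(x)                   # (B, T, d_model) -> T tokens

# Inverted embedding (iTransformer-style): each variate's full series becomes
# a token, so self-attention mixes information across variates instead.
inverted_embed = nn.Linear(T, d_model)
tokens_var = inverted_embed(x.transpose(1, 2))   # (B, V, d_model) -> V tokens

encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
print(encoder(tokens_time).shape)  # torch.Size([32, 128, 64])
print(encoder(tokens_var).shape)   # torch.Size([32, 3, 64])
```

The same inversion can in principle be wrapped around other encoder variants such as the Informer or Reformer, which is the comparison the paper investigates; the details of those architectures are not reproduced here.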

Authors

Guirguis J; Abdulmaksoud A; Ismail M; Kollmeyer PJ; Ahmed R

Journal

IEEE Access, Vol. 12, pp. 167108–167119

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

January 1, 2024

DOI

10.1109/access.2024.3495560

ISSN

2169-3536
