Conference

E-LSTM: An extension to the LSTM architecture for incorporating long lag dependencies

Abstract

The Long Short-Term Memory (LSTM) architecture is one of the most successful types of Recurrent Neural Networks (RNNs). However, the number of parameters an LSTM needs to achieve acceptable performance can be larger than desired for standard devices. In this work, an Extended LSTM (E-LSTM) architecture is proposed that reduces the number of parameters needed to achieve performance similar to that of LSTMs. The architecture of the proposed E-LSTM is …
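As background to the parameter-count motivation above (this is the standard LSTM formulation, not the paper's E-LSTM, whose details are truncated here), a conventional LSTM layer uses four gates, each with an input weight matrix, a recurrent weight matrix, and a bias, so its parameter count can be sketched as:

```python
def lstm_param_count(input_size: int, hidden_size: int) -> int:
    """Parameter count of a standard LSTM layer.

    Four gates (input, forget, cell, output), each with:
      - an input weight matrix  (hidden_size x input_size)
      - a recurrent weight matrix (hidden_size x hidden_size)
      - a bias vector (hidden_size)
    """
    per_gate = (hidden_size * input_size
                + hidden_size * hidden_size
                + hidden_size)
    return 4 * per_gate

# The count grows quadratically in the hidden size, which is what
# motivates architectures that match LSTM accuracy with fewer parameters.
print(lstm_param_count(128, 256))  # -> 394240
```

Because the recurrent term is quadratic in the hidden size, even modest hidden dimensions can exceed what is practical on standard devices, which is the gap the proposed E-LSTM targets.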

Authors

Martinez-Garcia F; Down D

Volume

00

Pagination

pp. 1-8

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

July 23, 2022

DOI

10.1109/ijcnn55064.2022.9892810

Name of conference

2022 International Joint Conference on Neural Networks (IJCNN)