
Glossary of Artificial Intelligence (AI), Machine Learning (ML), and Big Data Terms

Long Short-Term Memory (LSTM)

A recurrent neural network (RNN) that remembers previously presented information and uses it when processing the current input. As an RNN, an LSTM has feedback loops in which outputs from neurons at certain layers are fed back as inputs to those same layers, which makes it particularly good at processing data that occurs in sequences or time series, such as natural language or video. A standard RNN already has both long-term memory, since learned weights and biases persist across sequences, and short-term memory, since recent activation patterns reflect what has just happened; LSTMs enhance RNNs by providing a short-term memory that can persist for thousands of timesteps, hence a "long" form of short-term memory.

An LSTM unit is most often composed of a cell, an input gate, an output gate, and a "forget" gate. The cell remembers values over time, while the gates control the flow of information into and out of the cell. LSTMs are particularly well suited to handwriting recognition, speech recognition, machine translation, video annotation and control, robotic systems, and healthcare applications. The development of Transformer architecture networks has provided an alternative to LSTMs, which were previously the most widely used networks for sequential data.
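To make the gate structure concrete, below is a minimal sketch of a single LSTM cell step in NumPy. The function name, weight shapes, and toy dimensions are illustrative assumptions rather than any particular library's API; they follow the standard formulation in which the forget, input, and output gates modulate the cell state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x_t, h_prev, c_prev, params):
    """One LSTM timestep: gates decide what the cell state keeps, adds, and exposes."""
    W_f, W_i, W_o, W_c, b_f, b_i, b_o, b_c = params
    z = np.concatenate([h_prev, x_t])      # combine previous hidden state and current input
    f_t = sigmoid(W_f @ z + b_f)           # forget gate: how much of the old cell state to keep
    i_t = sigmoid(W_i @ z + b_i)           # input gate: how much new information to write
    o_t = sigmoid(W_o @ z + b_o)           # output gate: how much of the cell state to expose
    c_hat = np.tanh(W_c @ z + b_c)         # candidate values for the cell state
    c_t = f_t * c_prev + i_t * c_hat       # updated cell state (the "long" short-term memory)
    h_t = o_t * np.tanh(c_t)               # new hidden state / output for this timestep
    return h_t, c_t

# Hypothetical usage: random weights, input size 4, hidden size 3, a 5-step sequence
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3
params = [rng.standard_normal((hidden_size, hidden_size + input_size)) * 0.1 for _ in range(4)] \
       + [np.zeros(hidden_size) for _ in range(4)]
h, c = np.zeros(hidden_size), np.zeros(hidden_size)
for x_t in rng.standard_normal((5, input_size)):
    h, c = lstm_cell_step(x_t, h, c, params)
```

Because the cell state update is mostly additive (scaled by the forget gate), gradients and information can flow across many timesteps without vanishing as quickly as in a plain RNN.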
