A type of deep learning artificial neural network that performs machine learning tasks on time series or other sequential data. RNNs contain loops or cycles that allow a node's output to feed back, through other layers, as input to that same node. This feedback acts as a form of memory and enables the network to handle sequential data; RNNs can use their internal state to process input sequences of variable length. RNNs are commonly used in speech recognition and natural language processing (NLP). Long Short-Term Memory (LSTM) deep learning neural networks are a form of RNN. Transformer architecture models have shown better performance than RNNs at many tasks RNNs have traditionally been used for, including NLP activities. Another term for RNN is Feedback Neural Network (FNN).
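The recurrence described above can be sketched as a minimal vanilla RNN forward pass in NumPy. This is an illustrative sketch, not any particular library's implementation; the layer sizes, random weights, and function names here are assumptions chosen for the example.

```python
import numpy as np

# Minimal sketch of a vanilla RNN step (illustrative example;
# sizes and weights below are arbitrary, not from a real model).
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the recurrent loop)
b_h = np.zeros(hidden_size)

def rnn_forward(sequence, h=None):
    """Process a variable-length sequence, carrying hidden state (memory)."""
    if h is None:
        h = np.zeros(hidden_size)
    for x in sequence:
        # The previous hidden state h feeds back into the same update,
        # so earlier inputs influence how later inputs are processed.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    return h

# The same function handles sequences of any length.
short_seq = rng.normal(size=(2, input_size))
long_seq = rng.normal(size=(7, input_size))
h_short = rnn_forward(short_seq)
h_long = rnn_forward(long_seq)
print(h_short.shape, h_long.shape)  # (4,) (4,)
```

Because the hidden state has a fixed size regardless of sequence length, the same weights are reused at every time step; this weight sharing is what lets an RNN accept variable-length input.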