In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM), explain how these terms relate to AI, and discuss why it’s important to know about them.
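For listeners who want a concrete anchor for the terms discussed, here is a minimal sketch of a single LSTM step in NumPy. The weight shapes, random inputs, and helper names are illustrative assumptions, not anything from the episode; the point is only to show how the gates let the cell state carry long-range information, which a plain RNN struggles to do.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step: gates decide what the cell state
    forgets, what new information it stores, and what it exposes."""
    z = W @ np.concatenate([x, h_prev]) + b  # all four gate pre-activations at once
    n = h_prev.shape[0]
    f = sigmoid(z[0 * n:1 * n])   # forget gate: keep or discard old memory
    i = sigmoid(z[1 * n:2 * n])   # input gate: admit new information
    g = np.tanh(z[2 * n:3 * n])   # candidate cell update
    o = sigmoid(z[3 * n:4 * n])   # output gate: expose memory as output
    c = f * c_prev + i * g        # new cell state (the "long-term" memory)
    h = o * np.tanh(c)            # new hidden state (the "short-term" output)
    return h, c

# Toy run over a short sequence with hypothetical sizes and random weights
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):  # 5 time steps
    h, c = lstm_step(x, h, c, W, b)
```

A plain RNN would replace all of the above with a single `h = tanh(W @ [x, h_prev] + b)` update; the separate cell state and gates are what give the LSTM its longer memory.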
Want to deepen your understanding of artificial intelligence, machine learning, or big data concepts? Want to learn how to apply AI and data using hands-on approaches and the latest technologies? Check out the hand-selected books in our Suggested Reading List that can help you expand your knowledge or put it to use.
Show Notes:
- FREE Intro to CPMAI mini course
- CPMAI Training and Certification
- Suggested Reading List
- AI Glossary
- Glossary Series: Natural Language Processing (NLP), NLU, NLG, Speech-to-Text, TTS, Speech Recognition
- Glossary Series: Recognition Systems, Computer Vision, ImageNet
- Glossary Series: Training Data, Epoch, Batch, Learning Curve
- Glossary Series: (Artificial) Neural Networks, Node (Neuron), Layer
- Glossary Series: Bias, Weight, Activation Function, Convergence, ReLU
- Glossary Series: Perceptron
- Glossary Series: Hidden Layer, Deep Learning
- Glossary Series: Loss Function, Cost Function & Gradient Descent
- Glossary Series: Backpropagation, Learning Rate, Optimizer
- Glossary Series: Feed-Forward Neural Network