Sometimes it makes sense to build multiple decision trees and average their results, for example to improve performance or robustness and to reduce overfitting. In other cases it makes sense to boost decision trees, training each new tree to correct the errors of the ones before it. In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Random Forest and Boosted Trees, explain how they relate to AI, and discuss why it's important to know about them.
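The two ideas can be sketched in a few lines of code. This is a minimal illustration, assuming scikit-learn is available and using a synthetic dataset (the episode itself is library-agnostic): a Random Forest averages many trees trained on bootstrap samples, while gradient-boosted trees add shallow trees sequentially, each correcting the previous ones.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Synthetic classification data for illustration only.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single deep tree tends to overfit the training data.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Random Forest: many trees on bootstrap samples; predictions are averaged.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Boosted trees: shallow trees added one at a time, each fit to the
# remaining errors of the ensemble built so far.
boosted = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

for name, model in [("single tree", tree), ("random forest", forest), ("boosted trees", boosted)]:
    print(name, round(model.score(X_test, y_test), 3))
```

On data like this, the two ensembles typically generalize better than the single tree, which is the overfitting point the episode discusses.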
Show Notes:
- FREE Intro to CPMAI mini course
- CPMAI Training and Certification
- AI Glossary
- Glossary Series: Artificial Intelligence
- AI Glossary Series – Machine Learning, Algorithm, Model
- Glossary Series: Prediction, Inference, and Generalization
- Glossary Series: Overfitting, Underfitting, Bias, Variance, Bias/Variance Tradeoff
- AI Glossary Series: Ensemble Models
- AI Glossary Series: Decision Trees
- Glossary Series: Machine Learning Approaches: Supervised Learning, Unsupervised Learning, Reinforcement Learning
- Glossary Series: Classification & Classifier, Binary Classifier, Multiclass Classifier, Decision Boundary
- Glossary Series: Regression, Linear Regression