
Glossary of Artificial Intelligence (AI), Machine Learning (ML), and Big Data Terms

Hyperparameter Tuning

The process of adjusting a model's hyperparameters to improve its overall fit, performance, accuracy, and other measures. Common approaches to hyperparameter tuning include the following (a code sketch of the first two appears after the list):

  • Grid Search: Exhaustively evaluating every combination in a pre-defined grid of hyperparameter values and keeping the best-performing one; a brute-force search.
  • Random Search: Sampling hyperparameter combinations at random from specified ranges rather than trying them all. Faster than grid search, and it can outperform grid search when only a few hyperparameters strongly affect performance.
  • Bayesian Optimization: Building a probabilistic model that maps hyperparameter values to the objective evaluated on a validation set, and using it to choose the most promising values to try next.
  • Gradient-Based Optimization: Computing the gradient of the validation objective with respect to the hyperparameters and optimizing them with gradient descent.
  • Evolutionary Optimization: Applying evolutionary algorithms (for example, mutation and selection over a population of hyperparameter configurations) to search for better-performing settings.
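
To make the first two approaches concrete, below is a minimal sketch using scikit-learn's GridSearchCV and RandomizedSearchCV. The SVC model, parameter ranges, and synthetic dataset are illustrative assumptions, not part of the glossary definition.

# Sketch of grid search vs. random search for hyperparameter tuning.
# Assumes scikit-learn and scipy are installed; model and ranges are illustrative.
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

# Synthetic classification data stands in for a real training set.
X, y = make_classification(n_samples=200, random_state=0)

# Grid search: exhaustively evaluate every combination in the grid.
grid = GridSearchCV(
    SVC(),
    param_grid={"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},
    cv=5,
)
grid.fit(X, y)
print("Grid search best params:", grid.best_params_)

# Random search: sample a fixed number of combinations from distributions.
rand = RandomizedSearchCV(
    SVC(),
    param_distributions={"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-3, 1e1)},
    n_iter=20,
    cv=5,
    random_state=0,
)
rand.fit(X, y)
print("Random search best params:", rand.best_params_)

Grid search evaluates all 9 combinations above, while random search draws 20 samples from continuous ranges, which often covers the important hyperparameters more efficiently as the number of parameters grows.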
