Hyperparameters are configuration settings chosen before a model is trained; they control how a machine learning model learns. Examples include the learning rate of a neural network, the number of training epochs, the number of layers and nodes per layer in a neural network, the "k" in k-nearest neighbors, the number of clusters in a clustering algorithm, the number of branches in a decision tree, ensemble configuration, and other similar user-defined settings. The choice of hyperparameters has a significant impact on the model's bias and variance, its tendency to overfit or underfit, and its accuracy, precision, and other performance measures. Hyperparameters are set by humans, whereas the parameters of a model (such as its weights and biases) are learned by the machine during training.
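The distinction can be made concrete with a small sketch: a toy linear model trained by gradient descent, where the learning rate and the number of epochs are hyperparameters fixed up front, while the weight and bias are parameters the training loop learns from data. (The specific values and the toy dataset here are illustrative choices, not from any particular library.)

```python
# Hyperparameters: chosen by a human before training begins.
learning_rate = 0.1   # step size for each gradient-descent update
num_epochs = 200      # number of full passes over the training data

# Toy training data generated by the rule y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

# Parameters: learned by the machine during training.
weight, bias = 0.0, 0.0

for _ in range(num_epochs):
    # Gradients of mean squared error with respect to weight and bias.
    grad_w = sum(2 * (weight * x + bias - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (weight * x + bias - y) for x, y in zip(xs, ys)) / len(xs)
    weight -= learning_rate * grad_w
    bias -= learning_rate * grad_b

print(round(weight, 2), round(bias, 2))  # approaches 2.0 and 1.0
```

Changing a hyperparameter such as `learning_rate` does not alter the data or the model's form, but it changes how (and whether) the learned parameters converge: too large a step size can make training diverge, while too small a one makes it needlessly slow.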