Optimization algorithms that update a model's weights using the gradients computed during backpropagation, with the goal of minimizing the loss faster and more reliably. Different optimizers suit different kinds of optimization tasks, with Adam being the most popular. Other optimizers include Stochastic Gradient Descent (SGD), RMSprop, Nadam, RAdam, and Ranger.
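As a minimal sketch of how such an optimizer works, the following plain-Python Adam update (names and hyperparameter defaults are illustrative, not tied to any particular library) minimizes the toy loss f(w) = (w - 3)^2 using its gradient, the same role Adam plays for gradients produced by backpropagation:

```python
def adam_step(w, g, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum-style (m) and RMS-style (v) gradient averages."""
    m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * g * g    # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)           # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (v_hat ** 0.5 + eps)
    return w, m, v

# Toy problem: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 501):
    g = 2 * (w - 3)
    w, m, v = adam_step(w, g, m, v, t)
print(w)  # ends up near the minimum at w = 3
```

Swapping the body of `adam_step` for `w - lr * g` gives plain SGD; the moment estimates are what let Adam adapt its effective step size per parameter.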