Function that determines whether, and how strongly, a neuron will “activate” and produce an output. The Activation Function transforms the node's summed weighted input (the sum of the weights multiplied by the inputs, plus the bias) into an output value that is then passed to the next layer or to the final output. Activation Functions are nonlinear functions, such as the sigmoid, tanh, and step functions; this non-linearity is what enables a network to model complex behavior. Without Activation Functions, a network of neurons with weights and biases reduces to a single linear combination of its inputs, no matter how many layers it has. One popular Activation Function is ReLU, which outputs the input when it is positive and zero otherwise.
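The computation described above can be sketched in a few lines of Python. The function names `relu` and `neuron_output` are illustrative, not from any particular library; the example shows the summed weighted input (weights times inputs, plus bias) being passed through ReLU.

```python
def relu(z):
    # ReLU: return the input when positive, otherwise zero.
    return max(0.0, z)

def neuron_output(inputs, weights, bias):
    # Summed weighted input: sum of (weight * input) pairs, plus the bias.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # The Activation Function transforms z into the neuron's output.
    return relu(z)

# Positive summed input passes through: 0.5*1.0 + (-0.25)*2.0 + 0.1 = 0.1
print(neuron_output([1.0, 2.0], [0.5, -0.25], 0.1))   # 0.1
# Negative summed input is clipped to zero: -0.5 - 0.5 + 0.1 = -0.9
print(neuron_output([1.0, 2.0], [-0.5, -0.25], 0.1))  # 0.0
```

Because ReLU is cheap to compute and keeps gradients from vanishing for positive inputs, it is a common default choice in modern networks.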