A simple and fast non-linear activation function defined as ReLU(x) = max(0, x). Networks trained with ReLU have been reported to converge up to six times faster than those using the saturating sigmoid and tanh activations (Krizhevsky et al., 2012).
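As a quick illustration, here is a minimal sketch of ReLU in Python with NumPy; the function name and sample values are arbitrary examples, not part of any particular library:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: keeps positive values, zeroes out the rest."""
    return np.maximum(0, x)

# Example: negative inputs become 0, positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

Because the positive half of the function is the identity, its gradient there is exactly 1, which avoids the vanishing-gradient problem that slows training with sigmoid and tanh.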