Glossary of Artificial Intelligence (AI), Machine Learning (ML), and Big Data Terms
Rectified Linear Unit (ReLU)
A simple and fast non-linear activation function defined as ReLU(x) = max(0, x). Networks trained with ReLU have been reported to converge several times faster (roughly 6x in some published experiments) than networks using the sigmoid or tanh activation functions.
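As a minimal sketch, assuming NumPy as the array library (the definition itself is framework-independent), ReLU can be implemented in one line:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns x where x > 0, and 0 otherwise."""
    return np.maximum(0, x)

# Example: negative inputs are zeroed, positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```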