Glossary of Artificial Intelligence (AI), Machine Learning (ML), and Big Data Terms

Rectified Linear Unit (ReLU)

A simple and fast non-linear activation function defined as ReLU(x) = max(0, x). Networks trained with ReLU have been reported to converge up to 6x faster than equivalent networks using sigmoid or tanh activations.
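As a minimal illustrative sketch (not part of the original definition), ReLU can be implemented element-wise in a few lines of Python with NumPy:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: passes positive values through, zeroes out the rest."""
    return np.maximum(0, x)

# Example: negative inputs map to 0, positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```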
