
Glossary of Artificial Intelligence (AI), Machine Learning (ML), and Big Data Terms

Rectified Linear Unit (ReLU)

A simple and fast non-linear activation function defined as ReLU(x) = max(0, x). Networks trained with ReLU have been reported to converge up to six times faster than equivalent networks using sigmoid or tanh activation functions, since ReLU does not saturate for positive inputs and its gradient is cheap to compute.
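Because the definition is element-wise, it is straightforward to apply to whole arrays. A minimal sketch in Python with NumPy (the function name relu and the sample values are illustrative, not from the glossary):

import numpy as np

def relu(x):
    # Element-wise ReLU: each negative entry is clamped to 0,
    # each non-negative entry passes through unchanged.
    return np.maximum(0.0, x)

# Example: negative inputs become 0; positive inputs are unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]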
