
Glossary of Artificial Intelligence (AI), Machine Learning (ML), and Big Data Terms

Rectified Linear Unit (ReLU)

A simple and fast non-linear activation function defined as ReLU(x) = max(0, x): it passes positive inputs through unchanged and maps negative inputs to zero. Networks trained with ReLU have been reported to converge up to six times faster than equivalent networks using the sigmoid or tanh activation functions.
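As a minimal sketch of the definition above, the following NumPy function applies ReLU element-wise to an array (the function name and example values are illustrative, not from the original entry):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: element-wise max(0, x)."""
    return np.maximum(0, x)

# Negative inputs are zeroed; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```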
