The first approaches to artificial neural networks, in the 1940s and 1950s, provided a single layer of artificial neurons that could process inputs and produce predictions. The perceptron is the simplest kind of neural network: a single-layer perceptron consists of a single layer of output nodes, with the inputs fed directly to the outputs via weights and a bias. Challenges with perceptrons and early neural networks contributed to the first AI Winter, and neural networks did not regain broad interest until decades later, when algorithmic developments such as backpropagation, hardware advances such as GPUs, and the availability of Big Data made multi-layer deep learning networks feasible.
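The single-layer perceptron described above can be sketched in a few lines: each prediction is a weighted sum of the inputs plus a bias, passed through a step function, and the classic perceptron learning rule nudges the weights toward correct outputs. This is a minimal illustrative sketch (the function names and the choice of the OR function as training data are ours, not from the text):

```python
import numpy as np

def perceptron_predict(x, weights, bias):
    """Single-layer perceptron: weighted sum of inputs plus bias,
    passed through a step activation function."""
    return 1 if np.dot(weights, x) + bias > 0 else 0

# Train with the perceptron learning rule on the (linearly separable) OR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])

weights = np.zeros(2)
bias = 0.0
lr = 0.1
for _ in range(10):  # a few passes over the data suffice here
    for xi, target in zip(X, y):
        error = target - perceptron_predict(xi, weights, bias)
        weights += lr * error * xi  # nudge weights toward the target
        bias += lr * error

print([perceptron_predict(xi, weights, bias) for xi in X])  # [0, 1, 1, 1]
```

Because a single layer can only draw one linear decision boundary, a perceptron cannot learn non-linearly-separable functions such as XOR, one of the limitations that contributed to the loss of interest described above.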