In its most basic form, a neural network is a set of multilayered,
interconnected nodes, often referred to as neurons. Each neuron receives input
from neurons in the previous layer, performs a computation, and passes the
result to the neurons in the next layer. The simplest neural network, a
single-layer perceptron, consists of a set of input nodes, each connected
directly to an output node; because the output is just a weighted sum of the
inputs, it can only learn linearly separable relationships. To address more
complex tasks and capture non-linear relationships, multi-layer neural
networks can be used. These consist of multiple layers of neurons, including
one or more hidden layers between the input and output layers, and are often
referred to as deep neural networks given their depth of hidden layers.
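The layer-by-layer computation described above can be sketched as a minimal forward pass in NumPy. This is an illustrative toy, not a full implementation: the layer sizes, the ReLU activation, and the random weights are all assumptions chosen for the example.

```python
import numpy as np

def relu(x):
    # Non-linear activation applied element-wise; without it, stacked
    # layers would collapse into a single linear transformation
    return np.maximum(0.0, x)

def forward(x, W1, b1, W2, b2):
    # Hidden layer: each neuron computes a weighted sum of the inputs
    # plus a bias, then applies the non-linear activation
    h = relu(W1 @ x + b1)
    # Output layer: combines the hidden-layer activations into the result
    return W2 @ h + b2

rng = np.random.default_rng(0)
x = rng.normal(size=3)                         # 3 input features
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # hidden layer, 4 neurons
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)  # single output neuron
y = forward(x, W1, b1, W2, b2)
print(y.shape)
```

Each matrix `W` holds the connection weights between two adjacent layers, so adding depth is just a matter of inserting more weight matrices and activations between the input and the output.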