Neural Network
However, as computers have become increasingly powerful in recent years, the complexity of ANNs has likewise
increased, and they are now frequently applied to practical problems such as:
• Speech and handwriting recognition programs, like those used by voicemail transcription services and
postal mail sorting machines.
• Automation of smart devices, such as an office building's environmental controls, self-driving cars, and
self-piloting drones.
• Sophisticated models of weather and climate patterns, tensile strength, fluid dynamics, and many other
scientific, social, or economic phenomena.
ANNs are versatile learners that can be applied to nearly any learning task: classification, numeric
prediction, and even unsupervised pattern recognition.
From biological to artificial neurons
• An activation function, which transforms a neuron's net input signal into a single output signal to be
broadcast further in the network
• A network topology (or architecture), which describes the number of neurons in the model as well as
the number of layers and the manner in which they are connected
• The training algorithm, which specifies how connection weights are set in order to inhibit or excite
neurons in proportion to the input signal
Activation functions
Backpropagation is an algorithm for supervised learning of artificial neural networks using gradient descent.
Given an ANN and an error function, the method calculates the gradient of the error function with respect to
the network's weights.
How Backpropagation Algorithm Works
1. Inputs X arrive through the preconnected path.
2. The input is modelled using real weights W, which are
usually randomly selected.
3. Calculate the output for every neuron, from the input
layer through the hidden layers to the output layer.
4. Calculate the error in the outputs.
5. Travel back from the output layer to the hidden layers to
adjust the weights so that the error is decreased.
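The steps above can be sketched end to end on a deliberately tiny network (one input, one hidden neuron, one output neuron, biases omitted; the learning rate and squared-error function are assumptions for this example):

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Steps 1-2: input X and randomly selected weights W
x, target = 0.5, 1.0
w_hidden = random.uniform(-1, 1)
w_output = random.uniform(-1, 1)
lr = 0.5  # learning rate (assumed)

for _ in range(1000):
    # Step 3: forward pass, input layer -> hidden layer -> output layer
    h = sigmoid(w_hidden * x)
    y = sigmoid(w_output * h)
    # Step 4: squared error at the output
    error = 0.5 * (y - target) ** 2
    # Step 5: backward pass; the chain rule gives each weight's gradient,
    # and a gradient-descent step adjusts the weights to decrease the error
    delta_out = (y - target) * y * (1 - y)
    delta_hid = delta_out * w_output * h * (1 - h)
    w_output -= lr * delta_out * h
    w_hidden -= lr * delta_hid * x

print(error)  # the error shrinks toward 0 as training proceeds
```

A real implementation vectorises these updates over whole layers and batches of training examples, but the forward/backward structure is the same.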
In the backward phase, the network's output signal resulting from the forward phase is compared to the
true target value in the training data. The difference between the network's output signal and the true value
produces an error that is propagated backwards through the network to modify the connection weights between
neurons and reduce future errors.
Why Do We Need Backpropagation?
(Advantages)
• Backpropagation is fast, simple, and easy to program.
• It has no parameters to tune apart from the number of inputs.
• It is a flexible method, as it does not require prior knowledge
about the network.
• It is a standard method that generally works well.
• It does not need any special mention of the features of the
function to be learned.