What Is Backpropagation
edureka.co/blog/backpropagation
December 7, 2017
Saurabh
Backpropagation:
Backpropagation is a supervised learning algorithm for training Multi-layer Perceptrons
(Artificial Neural Networks).
But some of you might be wondering why we need to train a neural network at all, or what
training actually means.

When we build a network, we have to start with some initial weight values. Now obviously, we are not
superhuman, so there is no guarantee that whatever weight values we have selected will be correct
or will fit our model best.

Okay, fine, we have selected some weight values in the beginning, but our model output is
way off from the actual output, i.e. the error value is huge.

Basically, we need some way to tell the model to change its parameters (weights), such
that the error becomes minimal.
One way to train our model is called Backpropagation. Consider the diagram below:
Let me summarize the steps for you:
Calculate the error – How far is your model output from the actual output.
Minimum Error – Check whether the error is minimized or not.
Update the parameters – If the error is large, update the parameters
(weights and biases). After that, check the error again. Repeat the process until the
error becomes minimal.
Model is ready to make a prediction – Once the error becomes minimum, you
can feed some inputs to your model and it will produce the output.
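The loop described in these steps can be sketched in Python. `ToyModel` and its gradient-based `update_parameters` are made-up stand-ins for illustration (they are not from this post); the error here is the mean squared error:

```python
class ToyModel:
    """Illustrative model: output = w * x, trained by simple gradient steps.
    This class is a hypothetical stand-in, not from the original post."""
    def __init__(self):
        self.w = 0.0

    def predict(self, x):
        return self.w * x

    def update_parameters(self, inputs, targets, lr=0.1):
        # Gradient of the mean squared error with respect to w
        grad = sum(2 * (self.predict(x) - t) * x
                   for x, t in zip(inputs, targets)) / len(targets)
        self.w -= lr * grad


def train(model, inputs, targets, tolerance=1e-6, max_steps=10_000):
    """The loop from the steps above: calculate error, check, update, repeat."""
    for _ in range(max_steps):
        # Calculate the error: how far the model output is from the actual output
        error = sum((model.predict(x) - t) ** 2
                    for x, t in zip(inputs, targets)) / len(targets)
        # Minimum error: stop once the error is small enough
        if error < tolerance:
            break
        # Update the parameters, then check the error again
        model.update_parameters(inputs, targets)
    return model


model = train(ToyModel(), [0, 1, 2], [0, 2, 4])
```

After training on the data above, the model's weight settles near 2, the value that maps each input to its desired output.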
I am pretty sure that by now you know why we need Backpropagation and what it means
to train a model.
The Backpropagation algorithm looks for the minimum value of the error function in
weight space using a technique called the delta rule, or gradient descent. The weights that
minimize the error function are then considered to be a solution to the learning problem.
Suppose the model simply multiplies its input by the weight W (output = W × input), and this is the data we want it to fit:

| Input | Desired Output |
| --- | --- |
| 0 | 0 |
| 1 | 2 |
| 2 | 4 |

With an initial weight of W = 3, the model produces:

| Input | Desired Output | Model output (W=3) |
| --- | --- | --- |
| 0 | 0 | 0 |
| 1 | 2 | 3 |
| 2 | 4 | 6 |
Notice the difference between the actual output and the desired output:

| Input | Desired Output | Model output (W=3) | Absolute Error | Square Error |
| --- | --- | --- | --- | --- |
| 0 | 0 | 0 | 0 | 0 |
| 1 | 2 | 3 | 1 | 1 |
| 2 | 4 | 6 | 2 | 4 |
Let’s change the value of ‘W’. Notice the error when W = 4:

| Input | Desired Output | Model output (W=3) | Absolute Error | Square Error | Model output (W=4) | Square Error (W=4) |
| --- | --- | --- | --- | --- | --- | --- |
| 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2 | 3 | 1 | 1 | 4 | 4 |
| 2 | 4 | 6 | 2 | 4 | 8 | 16 |
Now, if you notice, when we increased the value of ‘W’ the error increased. So obviously
there is no point in increasing the value of ‘W’ further. But what happens if I decrease the
value of ‘W’? Consider the table below:
| Input | Desired Output | Model output (W=3) | Absolute Error | Square Error | Model output (W=2) | Square Error (W=2) |
| --- | --- | --- | --- | --- | --- | --- |
| 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2 | 3 | 1 | 1 | 2 | 0 |
| 2 | 4 | 6 | 2 | 4 | 4 | 0 |
So, we are trying to find the weight value at which the error is minimal.
Basically, we need to figure out whether to increase or decrease the weight value.
Once we know that, we keep updating the weight in that direction until the error
stops decreasing. You might reach a point where updating the weight further makes the
error increase. At that point you stop, and that is your final weight value.
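This trial-and-error search is exactly what gradient descent automates: instead of guessing a direction, we move W opposite to the slope of the squared error. A minimal sketch, assuming the toy model output = W × input that is consistent with the tables above (the learning rate 0.1 is an illustrative choice, not from the post):

```python
# Toy model: output = W * input; data from the tables above.
inputs = [0, 1, 2]
desired = [0, 2, 4]

W = 3.0    # initial weight, as in the first table
lr = 0.1   # learning rate (step size); illustrative value

for _ in range(100):
    # Slope of the squared error E(W) = sum((W*x - d)^2) with respect to W
    grad = sum(2 * (W * x - d) * x for x, d in zip(inputs, desired))
    W -= lr * grad   # move W against the slope: downhill on the error curve

print(round(W, 4))  # → 2.0
```

The loop drives W to 2, the value where the model output matches the desired output exactly and the error is zero, without ever having to ask "should I increase or decrease W?" by hand.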
The above network contains the following:
two inputs
two hidden neurons
two output neurons
two biases
First, each hidden neuron computes its total net input (a weighted sum of the inputs plus
a bias) and squashes it with an activation function to produce its output. We will then
repeat this process for the output layer neurons, using the outputs of the hidden
layer neurons as their inputs.
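As a sketch, the forward pass for this 2-2-2 network might look like the following. The weight names w1…w8, the shared biases b1 and b2, and the sigmoid activation are conventions assumed here, not spelled out in the surviving text:

```python
import math


def sigmoid(z):
    """Logistic activation, the assumed squashing function for this example."""
    return 1.0 / (1.0 + math.exp(-z))


def forward(i1, i2, w, b1, b2):
    """Forward pass through the 2-2-2 network described above.

    w = (w1..w8): input->hidden weights first, then hidden->output.
    b1 is shared by both hidden neurons, b2 by both output neurons
    (a common convention for this example; an assumption here)."""
    w1, w2, w3, w4, w5, w6, w7, w8 = w
    # Hidden layer: weighted sum of the inputs plus bias, squashed by sigmoid
    out_h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
    out_h2 = sigmoid(w3 * i1 + w4 * i2 + b1)
    # Output layer: the same computation, with the hidden outputs as inputs
    out_o1 = sigmoid(w5 * out_h1 + w6 * out_h2 + b2)
    out_o2 = sigmoid(w7 * out_h1 + w8 * out_h2 + b2)
    return out_o1, out_o2
```

Because the sigmoid squashes every net input into (0, 1), both outputs are always valid activations regardless of the weights.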
Now, let’s see what the value of the error is:

Consider W5: we will calculate the rate of change of the error w.r.t. a change in the weight W5.

Since we are propagating backwards, the first thing we need to do is calculate the change in
the total error w.r.t. the outputs O1 and O2.
6/8
Now, we will propagate further backwards and calculate the change in the output O1 w.r.t.
its total net input.

Let’s now see how much the total net input of O1 changes w.r.t. W5.
Step – 3: Putting all the values together and calculating the updated
weight value
Now, let’s put all the values together:
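Putting the three partial derivatives together is an application of the chain rule. Written out (with E_total the total squared error, out_o1 and net_o1 the output and net input of O1, and η the learning rate; the symbols follow the usual notation for this derivation):

```latex
\frac{\partial E_{total}}{\partial W_5}
  = \frac{\partial E_{total}}{\partial out_{o1}}
  \cdot \frac{\partial out_{o1}}{\partial net_{o1}}
  \cdot \frac{\partial net_{o1}}{\partial W_5},
\qquad
W_5^{+} = W_5 - \eta \,\frac{\partial E_{total}}{\partial W_5}
```

Each factor is one of the quantities computed in the steps above, and the updated weight W5⁺ is the old weight nudged against the gradient.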
Conclusion:
Well, if I have to sum up Backpropagation, the best option is to write pseudocode for
it.
Backpropagation Algorithm:
initialize network weights (often small random values)
do
    for each training example named ex
        prediction = neural-net-output(network, ex)  // forward pass
        actual = teacher-output(ex)
        compute error (prediction - actual) at the output units
        compute Δw_h for all weights from hidden layer to output layer  // backward pass
        compute Δw_i for all weights from input layer to hidden layer  // backward pass continued
        update network weights
until stopping criterion satisfied (e.g. the total error is sufficiently small)
return the network
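As a sketch, here is a minimal runnable version of that pseudocode in Python for the 2-2-2 network from earlier. The sigmoid activation, the squared-error loss E = ½ Σ(target − out)², and the weight layout are assumptions consistent with the walkthrough above; the biases are left fixed to keep the example short:

```python
import math


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def forward(x, W1, W2, b1, b2):
    """Forward pass: hidden activations h, then output activations o."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b1) for row in W1]
    o = [sigmoid(sum(w * hj for w, hj in zip(row, h)) + b2) for row in W2]
    return h, o


def backprop_step(x, target, W1, W2, b1, b2, lr=0.5):
    """One pass of the pseudocode: forward, backward, then update."""
    h, o = forward(x, W1, W2, b1, b2)                 # forward pass
    # Output-layer deltas: dE/dnet for E = 1/2 * sum((target - o)^2)
    d_o = [(ok - tk) * ok * (1 - ok) for ok, tk in zip(o, target)]
    # Hidden-layer deltas: chain the output deltas back through W2
    d_h = [sum(d_o[k] * W2[k][j] for k in range(len(o))) * h[j] * (1 - h[j])
           for j in range(len(h))]
    # Delta w_h: update hidden -> output weights ...
    for k in range(len(o)):
        for j in range(len(h)):
            W2[k][j] -= lr * d_o[k] * h[j]
    # Delta w_i: ... then input -> hidden weights
    for j in range(len(h)):
        for i in range(len(x)):
            W1[j][i] -= lr * d_h[j] * x[i]
```

One call to `backprop_step` moves every weight a little downhill on the error surface, so the squared error after the update is smaller than before; repeating the call is the `do … until` loop of the pseudocode.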
I hope you have enjoyed reading this blog on Backpropagation. Check out the Deep
Learning with TensorFlow Training by Edureka, a trusted online learning
company with a network of more than 250,000 satisfied learners spread across the
globe. The Edureka Deep Learning with TensorFlow Certification Training course helps
learners become experts in training and optimizing basic and convolutional neural
networks using real-time projects and assignments, along with concepts such as the SoftMax
function, autoencoder neural networks, and Restricted Boltzmann Machines (RBM).

Got a question for us? Please mention it in the comments section and we will get back to
you.