Backpropagation
Goal
Update the Model Parameters of a Neural Network based on the result of a Loss Function. The necessary operations can be derived from the Computational Graph of the Neural Network.
For a one-layer NN with no Activation Function and the MSE-Loss function^1:

$$L(w) = \frac{1}{N} \sum_{i=1}^{N} \left(y_i - \hat{y}_i\right)^2$$

where $\hat{y}_i = w^\top x_i$. The gradient is given by:

$$\frac{\partial L}{\partial w} = -\frac{2}{N} \sum_{i=1}^{N} \left(y_i - \hat{y}_i\right) x_i$$
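To make the formula concrete, here is a minimal NumPy sketch (the data, shapes, and variable names are illustrative assumptions, not part of the original note) that computes this analytic gradient and checks it against a finite-difference approximation:

```python
import numpy as np

# Illustrative setup: one linear layer y_hat = X @ w with MSE loss.
rng = np.random.default_rng(0)
N, D = 8, 3
X = rng.normal(size=(N, D))   # N samples with D features
w = rng.normal(size=D)        # weights of the single layer
y = rng.normal(size=N)        # targets

def mse_loss(w):
    y_hat = X @ w
    return np.mean((y - y_hat) ** 2)

def mse_grad(w):
    # dL/dw = -(2/N) * sum_i (y_i - y_hat_i) * x_i = -(2/N) * X^T (y - X w)
    return -2.0 / N * X.T @ (y - X @ w)

# Sanity check against a central finite-difference approximation
eps = 1e-6
num_grad = np.array([
    (mse_loss(w + eps * e) - mse_loss(w - eps * e)) / (2 * eps)
    for e in np.eye(D)
])
print(np.allclose(mse_grad(w), num_grad, atol=1e-6))  # True
```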
For a more realistic multi-layer neural network, the formulation of the loss function is more complicated, and computing its gradient therefore involves applying the chain rule multiple times. This can be done more easily with the help of the underlying Computational Graphs.
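As an illustration of this chain-rule bookkeeping, the following sketch backpropagates through a small two-layer network by hand (the ReLU activation, the shapes, and all variable names are assumptions made for the example); an autograd framework would derive the same steps automatically from the computational graph:

```python
import numpy as np

# Illustrative two-layer network: X -> z1 -> a1 -> y_hat -> loss
rng = np.random.default_rng(1)
N, D, H = 8, 3, 4
X = rng.normal(size=(N, D))
y = rng.normal(size=(N, 1))
W1 = rng.normal(size=(D, H))
W2 = rng.normal(size=(H, 1))

# Forward pass: each line is one node of the computational graph
z1 = X @ W1                  # pre-activation of layer 1
a1 = np.maximum(z1, 0.0)     # ReLU activation (assumed for this example)
y_hat = a1 @ W2              # output layer, no activation
loss = np.mean((y - y_hat) ** 2)

# Backward pass: chain rule applied node by node, from the loss back to W1
d_y_hat = -2.0 / y.size * (y - y_hat)   # dL/dy_hat
d_W2 = a1.T @ d_y_hat                   # dL/dW2 = a1^T * dL/dy_hat
d_a1 = d_y_hat @ W2.T                   # dL/da1
d_z1 = d_a1 * (z1 > 0)                  # ReLU passes gradient only where z1 > 0
d_W1 = X.T @ d_z1                       # dL/dW1 = X^T * dL/dz1

# One gradient-descent step on the parameters
lr = 0.1
W1 -= lr * d_W1
W2 -= lr * d_W2
print(loss)
```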
1. See MSE-Loss in: Loss Function