Backpropagation

Goal

Update the Model Parameters of a Neural Network based on the result of a Loss Function. The necessary operations can be derived from the Computational Graph of the Neural Network.
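
A minimal sketch of this training step, assuming PyTorch as the autograd framework (not part of the original note): the forward pass builds the Computational Graph, `loss.backward()` runs backpropagation through it, and the parameters are then updated from the resulting gradients.

```python
import torch

# one linear layer, MSE loss, a single gradient-descent update
model = torch.nn.Linear(3, 2)
loss_fn = torch.nn.MSELoss()
x = torch.randn(8, 3)
y = torch.randn(8, 2)

loss = loss_fn(model(x), y)   # forward pass builds the computational graph
loss.backward()               # backpropagation fills p.grad for every parameter

with torch.no_grad():         # gradient-descent update of the model parameters
    for p in model.parameters():
        p -= 0.1 * p.grad
```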

For a one-layer NN with no Activation Function, the MSE loss¹ is

$$L(W) = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2$$

Where $\hat{y}_i = W x_i$. The gradient is given by:

$$\frac{\partial L}{\partial W} = -\frac{2}{N}\sum_{i=1}^{N}\left(y_i - W x_i\right)\, x_i^{\top}$$
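
A quick numerical sanity check of this gradient, as a sketch assuming scalar outputs $\hat{y}_i = W x_i$ and the $\frac{1}{N}\sum_i(\cdot)^2$ form of the MSE above (all variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 8, 3
X = rng.normal(size=(N, d))        # inputs x_i as rows
y = rng.normal(size=N)             # scalar targets y_i
W = rng.normal(size=d)             # one-layer weights (scalar output)

def loss(W):
    y_hat = X @ W                  # y_hat_i = W x_i
    return np.mean((y - y_hat) ** 2)

# analytic gradient: dL/dW = -(2/N) * sum_i (y_i - W x_i) x_i
grad_analytic = -(2 / N) * (y - X @ W) @ X

# central finite-difference check of the analytic gradient
eps = 1e-6
grad_numeric = np.zeros_like(W)
for j in range(d):
    Wp, Wm = W.copy(), W.copy()
    Wp[j] += eps
    Wm[j] -= eps
    grad_numeric[j] = (loss(Wp) - loss(Wm)) / (2 * eps)

print(np.allclose(grad_analytic, grad_numeric, atol=1e-5))  # should print True
```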

For a more realistic multi-layer neural network

$$\hat{y} = \sigma\!\left(W_L\,\sigma\!\left(W_{L-1}\,\cdots\,\sigma\!\left(W_1 x\right)\right)\right)$$

the formulation of the loss function is more complicated, and its gradient $\frac{\partial L}{\partial W_l}$ with respect to the weights of each layer therefore involves applying the chain rule multiple times. This can be done more easily with the help of the underlying Computational Graphs.
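
To illustrate how the chain rule is applied node by node when walking the Computational Graph backwards, here is a hedged sketch for a two-layer network, assuming a sigmoid hidden activation, a linear output layer, and the same MSE loss as above (names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
N, d_in, d_hidden, d_out = 8, 3, 4, 2
X = rng.normal(size=(N, d_in))
Y = rng.normal(size=(N, d_out))
W1 = rng.normal(size=(d_hidden, d_in))
W2 = rng.normal(size=(d_out, d_hidden))

# forward pass: keep every intermediate node of the computational graph
Z1 = X @ W1.T                              # pre-activation of layer 1
A1 = sigmoid(Z1)                           # activation of layer 1
Y_hat = A1 @ W2.T                          # linear output of layer 2
L = np.mean(np.sum((Y - Y_hat) ** 2, axis=1))

# backward pass: one chain-rule step per node, from the loss backwards
dY_hat = -(2 / N) * (Y - Y_hat)            # dL/dY_hat
dW2 = dY_hat.T @ A1                        # dL/dW2
dA1 = dY_hat @ W2                          # dL/dA1
dZ1 = dA1 * A1 * (1 - A1)                  # dL/dZ1, using sigmoid'(z) = a(1-a)
dW1 = dZ1.T @ X                            # dL/dW1

# gradient-descent update of the Model Parameters (cf. the goal above)
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```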


  1. See MSE-Loss in: Loss Function

October 22, 2023