I think I've understood every step of the backpropagation algorithm except the most important one: how do the weights actually get updated, like at the end of this tutorial? __http://home.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html__

Neural network training is about finding weights that minimize prediction error. In general, we start our training with a set of randomly generated weights. Then, backpropagation is used to update the weights in an attempt to correctly map arbitrary inputs to outputs.

Backpropagation, short for “backward propagation of errors”, is the mechanism used to update the weights via __gradient descent__. It computes the gradient of the error function with respect to each weight, proceeding backward through the network, and then each weight is nudged a small step against its gradient: w_new = w_old − η · ∂E/∂w, where η is the learning rate.
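To make the update step concrete, here is a minimal sketch in NumPy of one backpropagation pass on a tiny 2-2-1 network with sigmoid activations and squared error, in the spirit of the linked tutorial. The specific numbers (initial weights, input, target, learning rate of 0.5) are made up for illustration, not taken from the tutorial:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny 2-2-1 network; all concrete values below are illustrative.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input -> hidden weights
W2 = rng.normal(size=(1, 2))   # hidden -> output weights
eta = 0.5                      # learning rate (assumed value)

x = np.array([0.05, 0.10])     # one training input
t = np.array([0.01])           # its target output

def forward(x):
    h = sigmoid(W1 @ x)        # hidden activations
    y = sigmoid(W2 @ h)        # network output
    return h, y

h, y = forward(x)
error_before = 0.5 * np.sum((t - y) ** 2)     # squared error E

# Backward pass: gradient of E w.r.t. each weight matrix.
delta_out = (y - t) * y * (1 - y)             # error signal at output
grad_W2 = np.outer(delta_out, h)              # dE/dW2
delta_hid = (W2.T @ delta_out) * h * (1 - h)  # error signal at hidden layer
grad_W1 = np.outer(delta_hid, x)              # dE/dW1

# The update step the question asks about: w <- w - eta * dE/dw
W2 -= eta * grad_W2
W1 -= eta * grad_W1

_, y_new = forward(x)
error_after = 0.5 * np.sum((t - y_new) ** 2)
print(error_after < error_before)  # one step should reduce the error
```

Each weight moves in the direction that decreases the error, so repeating this forward/backward/update loop over the training set gradually drives the output toward the target.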