
I think I've understood every step of the backpropagation algorithm except the most important one: how exactly do the weights get updated, as at the end of this tutorial? http://home.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html

1 Answer


Neural network training is about finding weights that minimize prediction error. Training usually starts from a set of randomly generated weights; backpropagation then updates those weights so that the network maps the training inputs to their desired outputs.

Backpropagation, short for “backward propagation of errors”, is the mechanism used to update the weights with gradient descent. It computes the gradient of the error function with respect to every weight, working backward through the network from the output layer. Each weight is then nudged a small step (scaled by the learning rate) in the direction that decreases the error.
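As a minimal sketch of that update rule: the code below trains a tiny 2-input, 2-hidden-unit, 1-output network (similar in spirit to the linked tutorial, but the shape, variable names `W1`, `W2`, `lr`, and the input/target values are my own illustrative choices) using sigmoid activations and a squared-error loss. The key lines are the two `-=` updates at the bottom of the loop, which apply `w <- w - lr * dE/dw`.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2)) * 0.5   # input -> hidden weights
W2 = rng.normal(size=(2, 1)) * 0.5   # hidden -> output weights

x = np.array([[0.35, 0.9]])          # one training input (1 x 2), illustrative values
t = np.array([[0.5]])                # target output
lr = 0.5                             # learning rate

for _ in range(500):
    # Forward pass
    h = sigmoid(x @ W1)              # hidden activations (1 x 2)
    y = sigmoid(h @ W2)              # network output (1 x 1)

    # Backward pass: gradients of E = 0.5 * (t - y)^2
    delta_out = (y - t) * y * (1 - y)             # error signal at the output unit
    delta_hid = (delta_out @ W2.T) * h * (1 - h)  # error signal at the hidden units

    # Gradient descent update: each weight moves against its gradient
    W2 -= lr * h.T @ delta_out
    W1 -= lr * x.T @ delta_hid

print(float(abs(y - t)))  # error shrinks toward zero as training proceeds
```

Repeating the forward pass, backward pass, and update over many iterations is what drives the error down; the learning rate controls how large each step is.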
