in AI and Deep Learning by (50.2k points)

My question is about the training set for a supervised artificial neural network (ANN).

The training set, as some of you probably know, consists of (input, desired output) pairs.

The training phase itself goes as follows:

for every pair in the training set:

- We feed the first value of the pair into the network and compute the output error, i.e., how far the generated output is from the desired output, which is the second value of the pair.

- Based on that error value, we use the backpropagation algorithm to compute the weight gradients and update the weights of the ANN.

end for
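In code, what I mean is roughly this (just a minimal sketch with a single linear neuron and squared error; all names and numbers are illustrative):

```python
import numpy as np

# Minimal version of the per-pair loop described above: one linear
# neuron trained with squared error, updated after every pair.
training_set = [(np.array([0.0, 1.0]), 1.0),
                (np.array([1.0, 0.0]), -1.0)]
w = np.zeros(2)                      # the network's weights
lr = 0.1                             # learning rate

for x, y_desired in training_set:
    y_out = w @ x                    # forward pass through the "network"
    error = y_out - y_desired        # how far we are from the desired output
    grads = error * x                # gradient of squared error for this pair
    w -= lr * grads                  # weight update based on that error
```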

Now assume that there are pair 1, pair 2, ..., pair m, ... in the training set.

We take pair 1, produce some error, and update the weights; then we take pair 2, and so on.

Later we reach pair m, produce some error, and update the weights again.

My question is: what if the weight update after pair m cancels out some weight update, or even several updates, that happened before it?

For example, if pair m cancels the weight updates made for pair 1, or pair 2, or both, then although the ANN will produce a reasonable output for input m, it will in a sense forget the updates for pair 1 and pair 2, and the results for inputs 1 and 2 will be poor. Then what's the point of training?

Unless we train the ANN with pair 1 and pair 2 again after pair m.

1 Answer

by (108k points)

Whenever you're performing supervised training, you should run several (or even thousands of) passes over the training dataset; each full pass is called an epoch. Repeating the passes is exactly what prevents the network from permanently forgetting earlier pairs: every pair keeps pulling the weights back toward a compromise that works reasonably for all of them.
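As a minimal sketch of what "several rounds" means, using a toy linear neuron (all names and constants are illustrative):

```python
import numpy as np
import random

# Many passes (epochs) over the same data: pairs seen early are revisited,
# so a later update cannot permanently erase their effect.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))             # 20 input vectors
y = X @ np.array([2.0, -1.0, 0.5])       # targets from a known linear rule
w = np.zeros(3)
lr = 0.05

pairs = list(zip(X, y))
for epoch in range(200):                 # hundreds or thousands of passes
    random.shuffle(pairs)                # common practice: reshuffle each epoch
    for x, y_desired in pairs:
        w -= lr * (x @ w - y_desired) * x   # per-pair gradient step

print(w)                                 # converges toward [2.0, -1.0, 0.5]
```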

The training set can be built directly from a time series. A fixed number of consecutive measured values is used as the input, and the value to be predicted (i.e., the value some chosen distance in the future, after those input values) is used as the required output. The input part of the time series is called the window, and the output is the predicted value. By shifting the window along the time series, the items of the training set are generated.
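For instance, a minimal self-contained sketch of the sliding-window construction (function and variable names are illustrative):

```python
import numpy as np

def make_windows(series, window_size, horizon=1):
    """Build (input window, future value) pairs from a 1-D time series."""
    inputs, targets = [], []
    for start in range(len(series) - window_size - horizon + 1):
        inputs.append(series[start:start + window_size])           # the window
        targets.append(series[start + window_size + horizon - 1])  # value to predict
    return np.array(inputs), np.array(targets)

series = np.sin(np.linspace(0, 10, 100))    # toy time series
X, y = make_windows(series, window_size=5)  # X: (95, 5), y: (95,)
```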

There are also two different ways of updating the parameters of a neural network during supervised training: stochastic training and batch training. Batch training loops over the dataset, accumulating the total error across the whole set, and updates the parameters (weights) only once, after all the error has been accumulated. Stochastic training is the process you describe, where the weights are adjusted after each (input, desired output) pair.
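To make the distinction concrete, here is a minimal sketch contrasting the two modes on a toy linear neuron (all names are illustrative, not a specific library API):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))              # 8 training pairs, 3 inputs each
y = X @ np.array([1.0, -2.0, 0.5])       # targets from a known linear rule
lr = 0.05

# Stochastic (online) training: update the weights after every single pair.
w = np.zeros(3)
for x_i, y_i in zip(X, y):
    w -= lr * (x_i @ w - y_i) * x_i      # immediate per-pair update

# Batch training: accumulate the gradient over the whole set, update once.
w = np.zeros(3)
grad_total = np.zeros(3)
for x_i, y_i in zip(X, y):
    grad_total += (x_i @ w - y_i) * x_i  # accumulate; weights stay fixed
w -= lr * grad_total / len(X)            # a single averaged update per pass
```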

If you wish to learn more about neural networks, visit this Neural Network Tutorial.
