in AI and Deep Learning by (18.7k points)

Suppose I have a very large training set, so large that MATLAB hangs while training or there is insufficient memory to hold the training set.

Is it possible to split the training set into parts and train the network by parts?

Is it possible to train the network with one sample at a time (one by one)?

1 Answer

by (41.4k points)

Incremental training in MATLAB can be applied to both static and dynamic networks, although it is more commonly used with dynamic networks, such as adaptive filters. You can manually divide the dataset into batches and train on them one after another:

for bn = 1:num_batches
    % Select the inputs/targets belonging to batch bn
    % (how you slice them depends on how your data is stored)
    inputs  = <get batch bn inputs>;
    targets = <get batch bn targets>;
    % Continue training the same network on the next batch
    net = train(net, inputs, targets);
end

Note that the batch size should be greater than 1; even so, this approach reduces the memory needed for training, since only one batch has to be held in memory at a time.
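As for training one sample at a time: shallow networks in the Deep Learning Toolbox also support adapt, which updates the weights incrementally after each presented sample rather than training on the whole set at once. A minimal sketch, assuming a simple linear network and made-up data (X, T, and the network choice here are illustrative, not from the original question):

```matlab
% Illustrative data (made up for this sketch): 1-D inputs and targets.
% Cell arrays make MATLAB present the samples sequentially, one at a time.
X = num2cell(rand(1, 100));
T = num2cell(rand(1, 100));

net = linearlayer;            % a simple network that supports adaptation
net = configure(net, X, T);   % size the weights/biases for this data

% adapt performs an incremental weight update after each sample
for i = 1:numel(X)
    [net, y, e] = adapt(net, X(i), T(i));
end
```

This keeps only one sample in the update at a time, at the cost of noisier, slower convergence than batch training.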

For more information, refer to the following link: https://in.mathworks.com/help/deeplearning/ug/neural-network-training-concepts.html