in Machine Learning by (19k points)

I've been training a CNN on several hundred GB of images. I wrote a training function that bites off 4 GB chunks of these images and calls fit on each chunk. I'm worried that I'm only really training on the last chunk and not on the entire dataset.

Effectively, my pseudo-code looks like this:

DS = lazy_load_400GB_Dataset()

for section in DS:
    X_train = section.images
    Y_train = section.classes
    model.fit(X_train, Y_train, batch_size=16, nb_epoch=30)

I know that the API and the Keras forums say that this will train over the entire dataset, but I can't intuitively understand why the network wouldn't relearn over just the last training chunk.

Some help in understanding this would be much appreciated.

1 Answer

by (33.1k points)

You can perform batch-wise training and evaluation using:

model.train_on_batch(X, y)

and

model.test_on_batch(X, y)

You can also write a generator that yields batches of training data and use the method model.fit_generator(data_generator, samples_per_epoch, nb_epoch).
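A minimal sketch of such a generator, assuming (as in the question's pseudo-code) that each dataset section exposes hypothetical `.images` and `.classes` arrays; only the generator itself is shown here, with the old-style Keras call left as a comment:

```python
import numpy as np

def batch_generator(sections, batch_size=16):
    """Yield (X, y) batches indefinitely, cycling over all dataset
    sections so every chunk -- not just the last one -- is visited.

    `sections` is assumed to be an iterable of objects with `.images`
    and `.classes` arrays, mirroring the question's pseudo-code."""
    while True:  # fit_generator expects an endless generator
        for section in sections:
            X, y = section.images, section.classes
            for start in range(0, len(X), batch_size):
                yield X[start:start + batch_size], y[start:start + batch_size]

# Old Keras 1.x style call, as named in the answer:
# model.fit_generator(batch_generator(DS, batch_size=16),
#                     samples_per_epoch=total_samples, nb_epoch=30)
```

Because the generator loops over every section before starting again, one "epoch" (as counted by samples_per_epoch) covers the whole 400 GB dataset, not only the final chunk.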

If you want to manage the iteration over your dataset manually, you should probably use model.train_on_batch and handle the batch sizes and epoch bookkeeping yourself. Either way, calling fit repeatedly does not reset the model: the weights persist between calls, so each chunk continues training the same network rather than starting over.

You should also make sure that the order of the samples you train on is shuffled after each epoch. As written, your example code never shuffles the dataset, so the network always sees the chunks in the same order.
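One way to sketch that shuffling, assuming in-memory NumPy arrays per chunk (the names `DS` and `section` are carried over from the question's pseudo-code; the train_on_batch call is left as a comment):

```python
import numpy as np

def shuffled_batches(X, y, batch_size=16, seed=None):
    """Yield (X, y) batches in a freshly shuffled order.

    Call this once per epoch so the sample order differs between
    epochs; a shared index permutation keeps each image paired with
    its label."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        sel = idx[start:start + batch_size]
        yield X[sel], y[sel]

# Outer loop over epochs, inner loop over chunks, so every chunk is
# revisited each epoch instead of 30 epochs on one chunk at a time:
# for epoch in range(30):
#     for section in DS:
#         for Xb, yb in shuffled_batches(section.images, section.classes):
#             model.train_on_batch(Xb, yb)
```

Note the loop order in the comment: putting epochs outermost avoids fitting 30 epochs on one chunk before ever seeing the next, which is what the original pseudo-code does.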

Hope this answer helps.
