You have a fixed training set for building your neural network model. The role of an epoch is to train the network on every item of that training set, i.e. one epoch is one forward pass and one backward pass over all of the training examples.
So, if you want to teach your neural network to recognize the 26 letters of the alphabet, 100 epochs would mean 2,600 individual training trials: each epoch trains on every letter (i.e. updates the network's weights) once, and 26 letters × 100 epochs = 2,600 trials.
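The alphabet example above can be sketched as a pair of nested loops. This is only an illustrative sketch: `train_step` is a hypothetical placeholder standing in for one forward pass, one backward pass, and a weight update on a single example.

```python
# Minimal sketch of the epoch loop described above.
letters = [chr(ord('A') + i) for i in range(26)]  # 26 training examples

def train_step(example):
    # Placeholder for: forward pass, loss, backward pass, weight update.
    pass

trials = 0
for epoch in range(100):        # 100 epochs
    for example in letters:     # each epoch visits every training example once
        train_step(example)
        trials += 1

print(trials)  # 26 examples x 100 epochs = 2600 training trials
```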
So, what is the right number of epochs?
There is no single right number of epochs, because it differs from dataset to dataset. That said, you can estimate a reasonable number of epochs by looking at the diversity of your data.
For example: does your dataset contain only black cats, or is it much more diverse? If your data contains only black cats, a small, fixed number of epochs may suffice; the more diverse your data is, the more the required number of epochs will vary, and you will typically need more of them.
Batch size is the number of training examples in one forward/backward pass. The higher the batch size, the more memory you will need.
An iteration, on the other hand, is a single update of your algorithm's parameters; how many iterations you run depends on the context. A single iteration involves the following steps:
Processing one batch of the training dataset.
Calculating the cost function.
Backpropagation and adjustment of all the weights.
Example: if you have 1,000 training examples and your batch size is 500, it will take 2 iterations to complete 1 epoch.
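The arithmetic in this example can be written down directly. The numbers are taken from the example above; `math.ceil` is used so that a final partial batch is counted when the dataset size is not evenly divisible by the batch size.

```python
import math

# Iterations per epoch = number of batches needed to cover the dataset once.
num_examples = 1000
batch_size = 500

iterations_per_epoch = math.ceil(num_examples / batch_size)
print(iterations_per_epoch)  # 1000 / 500 = 2 iterations per epoch
```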