for epoch in range(EPOCH):
    for x, t in dataloader_train:
        ...  # training step here
Here dataloader_train yields mini-batches: x is a mini-batch of images and t is the corresponding scalar labels. With the code above, I think the same (x, t) sequence is loaded in every pass of the EPOCH loop, so the same ordering is repeated EPOCH times. If the data were randomly re-sampled for each epoch it would be fine, I think, but as written the code does not seem appropriate.
Is my thinking correct? And if so, must I rewrite it as follows?
for epoch in range(EPOCH):
    dataloader_train = load_randomly()
    for x, t in dataloader_train:
        ...  # training step here
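For reference, here is a minimal sketch of what I mean by load_randomly(), assuming a PyTorch-style DataLoader; dataset_train and BATCH_SIZE are placeholder names I am introducing for illustration, not from my actual code:

from torch.utils.data import DataLoader

def load_randomly():
    # Re-create the loader so a fresh random order is drawn each epoch.
    # (My underlying question: does shuffle=True alone already do this?)
    return DataLoader(dataset_train, batch_size=BATCH_SIZE, shuffle=True)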
Thank you for your comments. So does that imply that EPOCH * BATCH_SIZE should be less than or equal to the total number of training samples in the dataset?
No, it's not like that. Actually, EPOCH has nothing to do with BATCH_SIZE.
Let me put it this way…
Let's say we have 2560 training examples and we choose a BATCH_SIZE of 256. So when we run

for x, t in dataloader_train:

this loop runs 10 times (2560 examples / 256 batch size).
That is one forward pass over all the 2560 training examples, also known as 1 EPOCH.
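If you want to check that counting yourself, here is a minimal sketch, assuming PyTorch (the random tensors are just dummy stand-ins for real images and labels):

import torch
from torch.utils.data import DataLoader, TensorDataset

# 2560 dummy examples: 1x28x28 "images" with integer labels in [0, 10)
dataset_train = TensorDataset(torch.randn(2560, 1, 28, 28),
                              torch.randint(0, 10, (2560,)))
dataloader_train = DataLoader(dataset_train, batch_size=256, shuffle=True)

print(len(dataloader_train))  # -> 10 batches per epoch (2560 / 256)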
Now if we want to repeat the same thing 50 times, we put it inside another loop, like…
for epoch in range(50):
    for x, t in dataloader_train:
        ...  # training step here
where our EPOCHS = 50
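So in total the inner loop body runs 50 * 10 = 500 times. Continuing the sketch above, you can count the steps like this:

EPOCHS = 50
steps = 0
for epoch in range(EPOCHS):        # 50 passes over the data
    for x, t in dataloader_train:  # 10 mini-batches per pass
        steps += 1                 # forward/backward/update would go here
print(steps)  # -> 500 = EPOCHS * batches_per_epoch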
I hope this makes sense
feel free to ask if you still have doubts
Thank you very much for your advice! Your comments made it clear to me (^ - ^).
What strikes me is that this is quite different from other ordinary programming languages, which follow purely sequential execution.