Is it possible to decrease or increase the batch size during the training loop, assuming I use a DataLoader to fetch my batches?
For example, I can adjust the learning rate during training via the optimizer's scheduler. Is there an equivalent of this for the batch size?
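To illustrate what I mean, here is a minimal sketch (not an official API) of one possible approach: rebuilding the DataLoader between epochs with a new batch size, analogous to stepping an LR scheduler. The dataset, model, and batch-size schedule below are placeholders.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset and model, just to keep the example self-contained.
dataset = TensorDataset(torch.randn(1024, 10), torch.randn(1024, 1))
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

# Hypothetical batch-size "schedule": double the batch size each epoch.
batch_size_schedule = [32, 64, 128]

for epoch, batch_size in enumerate(batch_size_schedule):
    # Recreate the loader with this epoch's batch size.
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: batch_size={batch_size}, last loss={loss.item():.4f}")
```

Is there a cleaner, scheduler-like way to do this?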
paganpasta
Is this something you are looking for?