zimmer550
(Sarim Mehdi)
November 4, 2019, 4:37pm
1
So, my question is basically a follow-up of this:
Hi. I am training an RNN encoder-decoder to predict the trajectories of agents, using the KITTI dataset. The training set consists of 21 image folders, and each folder contains a variable number of images. This means the batch size changes during training. For example:
A batch size of one, when training an encoder, means one sequence. With that in mind, say the encoder takes 5 images as input and the decoder also works on 5 images, for a total of 10 images. So, …
Is batch norm suitable for my situation, where the batch size can fluctuate between 1 and 10? I know batch norm cannot be applied with a batch size of 1. Is that a hard no for my situation, or is there something I could do?
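As context for why batch size 1 is a problem, here is a minimal pure-Python sketch (not from the thread) of what batch norm computes: it normalizes each feature using the mean and variance taken *across the batch*. With a single sample, the batch mean equals the sample and the batch variance is zero, so every normalized feature collapses to zero and the input values are lost.

```python
def batch_norm(batch, eps=1e-5):
    """Normalize each feature across the batch dimension.

    batch: list of samples, each a list of feature values.
    Returns the normalized batch (no learned scale/shift, for clarity).
    """
    n = len(batch)
    num_features = len(batch[0])
    normalized = [[0.0] * num_features for _ in range(n)]
    for j in range(num_features):
        col = [sample[j] for sample in batch]
        mean = sum(col) / n
        var = sum((x - mean) ** 2 for x in col) / n  # zero when n == 1
        for i in range(n):
            normalized[i][j] = (col[i] - mean) / (var + eps) ** 0.5
    return normalized

# With a batch of one, mean == the sample and var == 0,
# so the output is all zeros regardless of the input.
print(batch_norm([[3.0, -7.0, 42.0]]))  # → [[0.0, 0.0, 0.0]]
```

This is also why PyTorch's batch-norm layers refuse to run in training mode when there is only one value per channel: there is no meaningful batch statistic to estimate.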
SimonW
(Simon Wang)
November 4, 2019, 5:14pm
2
With a batch size that low, you are probably better off not using batch norm at all.
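One common substitute (not mentioned in the thread, but a natural follow-up) is a per-sample normalization such as layer norm, which computes statistics over each sample's own features and is therefore independent of batch size. A minimal pure-Python sketch:

```python
def layer_norm(sample, eps=1e-5):
    """Normalize a single sample across its own features.

    Uses per-sample statistics, so it behaves the same
    whether the batch holds 1 sequence or 10.
    """
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / n
    return [(x - mean) / (var + eps) ** 0.5 for x in sample]

# Statistics come from the sample itself, so a "batch" of one
# is no different from any other batch size.
print(layer_norm([1.0, 2.0, 3.0]))
```

In PyTorch, `nn.LayerNorm` and `nn.GroupNorm` are the built-in batch-size-independent normalization layers, and layer norm in particular is a standard choice inside RNN and sequence models.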