About batch size

In train.py, I set the batch size to 16 and passed a batch of data into a custom neural network. In the network's forward function, I printed the shape of the data and found that the batch size had changed to 8. What caused this?

Can you show the code of your model and how you fed the batch into it?

Thanks for your reply. I found the reason: my machine has two GPUs, and I used nn.DataParallel in the program, which splits each batch across the two devices.
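
For anyone hitting the same thing: nn.DataParallel splits each input batch along dimension 0 across the visible GPUs, so with two cards each replica's forward() sees half the batch. Below is a minimal sketch of this behavior (not the poster's actual train.py; TinyNet and the tensor shapes are made up for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical toy model: prints the batch dimension it receives in forward().
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 2)

    def forward(self, x):
        # With nn.DataParallel on 2 GPUs, each replica sees half the batch,
        # so this prints 8 even though the DataLoader batch size is 16.
        print("batch seen by this replica:", x.shape[0])
        return self.fc(x)

model = TinyNet()
if torch.cuda.device_count() > 1:
    # DataParallel scatters the input along dim 0 across the visible GPUs.
    model = nn.DataParallel(model).cuda()

batch = torch.randn(16, 10)  # batch size set to 16, as in the original train.py
if torch.cuda.is_available():
    batch = batch.cuda()
out = model(batch)  # on 2 GPUs, forward() runs once per replica with 8 samples each
```

Note that the outputs are gathered back onto the default device afterwards, so `out` still has 16 rows; only what each replica sees inside forward() is halved.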