[Resolved] Mini-batch training in PyTorch and input & parameter size

Hi,

I am new to PyTorch.

I made my own custom model and currently run it with online training (batch size = 1). As a next step, I want to extend the model to take mini-batches, so I am studying how to support mini-batch training in PyTorch.

  • Do I need to extend the parameter size so that it holds a copy for every sample in one propagation?

I think it is natural for the input and its labels to have a mini-batch dimension. But regarding the parameters, I think the model should not depend on that size, because the same model and parameters are used for inference after deployment.

Is my thinking correct?

Yes, the size of the parameters is independent of the batch size.
Your code is most likely already using “batched” inputs with a batch size of 1, so increasing the batch size should work out of the box.
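For example (a minimal sketch using a single nn.Linear layer as a stand-in for your model; the layer sizes are placeholders), the parameter shapes depend only on the layer definition, not on the batch size:

```python
import torch
import torch.nn as nn

# Tiny example model; the feature sizes (10 -> 5) are placeholders.
model = nn.Linear(10, 5)

# Parameter shapes are fixed by the layer definition, not by the batch size.
print(model.weight.shape)  # torch.Size([5, 10])
print(model.bias.shape)    # torch.Size([5])

# The same model accepts a "batched" input with batch size 1 ...
out = model(torch.randn(1, 10))
print(out.shape)   # torch.Size([1, 5])

# ... or a larger mini-batch, with no change to the parameters.
out = model(torch.randn(32, 10))
print(out.shape)   # torch.Size([32, 5])
```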

@ptrblck-san,

Again, thank you for your comment. Before running, I replaced the constant “1” in the input-loading code with a hyper-parameter set to a larger number, and now it works (^ - ^).
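In case it helps anyone else, here is roughly what the change looks like (a minimal sketch; the dataset, tensor sizes, and batch_size value below are placeholders, not my actual code):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset standing in for the real training data and labels.
dataset = TensorDataset(torch.randn(100, 10), torch.randint(0, 5, (100,)))

# Before: the batch size was hard-coded to 1 (online training).
# loader = DataLoader(dataset, batch_size=1, shuffle=True)

# After: the batch size is a hyper-parameter set to a larger number.
batch_size = 64
loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)

for inputs, labels in loader:
    print(inputs.shape)  # torch.Size([64, 10]) for full batches
    break
```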

My model did not work on TensorFlow but now it works fine with PyTorch.
