Hi,
I am new to PyTorch.
I built my own custom model and currently train it online (batch size = 1). As a next step, I want to extend the model to accept mini-batches, so I am studying how to support mini-batch training in PyTorch.
- Do I need to extend the parameter size so that the model holds a separate copy of the parameters for each sample in one forward propagation?
I understand that the inputs and their labels naturally gain a batch dimension of the mini-batch size. But regarding the parameters, I think the model should not grow with the batch size, because the same model and parameters are used for inference on individual samples after deployment.
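To check my understanding, here is a minimal sketch (using a simple `nn.Linear` as a stand-in for my model, which is different): the parameter shapes stay fixed whether I pass one sample or a mini-batch, and only the batch dimension of the input and output changes.

```python
import torch
import torch.nn as nn

# Toy model just to illustrate the question; my real model is a custom one.
model = nn.Linear(in_features=8, out_features=2)

# Parameter shapes are fixed by the model definition, not by the batch size:
# weight: (2, 8), bias: (2,)
for name, p in model.named_parameters():
    print(name, tuple(p.shape))

x_single = torch.randn(1, 8)   # online training: batch size = 1
x_batch = torch.randn(32, 8)   # mini-batch: batch size = 32

# Only the first (batch) dimension of the input/output changes;
# the same weight and bias are shared across all samples in the batch.
print(model(x_single).shape)  # torch.Size([1, 2])
print(model(x_batch).shape)   # torch.Size([32, 2])
```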
Is my thinking correct?