Can I set the batch_size of an LSTM model to None in PyTorch, like in TF?

I want to create a neural network model with PyTorch to implement reinforcement learning.

I need this model to interact with the environment, so I want to set batch_size=1 there.

But the model does not perform well on its own, so I first want to pre-train it with supervised learning to improve its performance. For that, I want to set batch_size > 1 (e.g. 64) to accelerate training.


In PyTorch, the batch size is not a property of the model.
You can feed the model an input with any batch size, and it will return an output with that same batch size. There is nothing to configure: the batch dimension is simply never fixed in the layer definitions.
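A minimal sketch to illustrate (the layer sizes and the sequence lengths here are hypothetical, not from your setup): the same `nn.LSTM`-based model accepts a batch of 64 for supervised pre-training and a batch of 1 for environment interaction, with no change to the model.

```python
import torch
import torch.nn as nn

class LSTMModel(nn.Module):
    """LSTM followed by a linear head; the batch dimension is never fixed."""
    def __init__(self, input_size=8, hidden_size=32, num_actions=4):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_actions)

    def forward(self, x, state=None):
        # x: (batch, seq_len, input_size) -- batch can be any size
        out, state = self.lstm(x, state)
        # use the last time step's output for the prediction
        return self.head(out[:, -1]), state

model = LSTMModel()

# Supervised pre-training: batch of 64 sequences of length 10
train_out, _ = model(torch.randn(64, 10, 8))
print(train_out.shape)  # torch.Size([64, 4])

# Environment interaction: batch of 1, one step at a time,
# carrying the recurrent state between calls
step_out, state = model(torch.randn(1, 1, 8))
step_out, state = model(torch.randn(1, 1, 8), state)
print(step_out.shape)  # torch.Size([1, 4])
```

When interacting with the environment step by step, keep passing the returned `state` back in, since the LSTM otherwise starts from a zero hidden state on every call.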