Batch normalisation

I am a beginner to PyTorch. I wanted to use BatchNorm1d in my code, but I am wondering what the batch size would be by default, as I could not find any user-specified input for the batch size.

PyTorch modules do not depend on the batch size, so you can pick any value for it.
However, as many posts on this forum show, too small a batch size might yield skewed running stats, and your model might then perform poorly on the validation and test sets. The GroupNorm paper explains this in more detail and proposes the nn.GroupNorm layer.
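To make this concrete, here is a small sketch showing that neither layer takes a batch size argument and that both accept inputs with any batch size. The channel count, group count, and tensor shapes are arbitrary example values, not anything from the original posts:

```python
import torch
import torch.nn as nn

# BatchNorm1d normalizes over the batch dimension, so very small batches
# give noisy running statistics. GroupNorm normalizes within each sample
# instead, which makes it independent of the batch size.
C = 8                                   # number of channels (example value)
bn = nn.BatchNorm1d(C)
gn = nn.GroupNorm(num_groups=4, num_channels=C)  # 4 groups is an arbitrary choice

x_small = torch.randn(2, C, 16)         # batch of only 2 samples, length 16
x_large = torch.randn(64, C, 16)        # batch of 64 samples

# Both layers accept any batch size; it was never specified at construction.
print(bn(x_small).shape)                # torch.Size([2, 8, 16])
print(bn(x_large).shape)                # torch.Size([64, 8, 16])
print(gn(x_small).shape)                # torch.Size([2, 8, 16])
print(gn(x_large).shape)                # torch.Size([64, 8, 16])
```

Note that the output shapes match the input shapes exactly; normalization layers never change the tensor shape.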

Thanks for your reply, but I am still not able to figure out where to specify the batch size.
In torch.nn.BatchNorm1d(num_features, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True), where do I specify the batch size? There is no argument for it.

Batch normalization does not need the batch size. Its learnable parameters, the weight (gamma) and bias (beta), each have shape (C,), where C is the channel dimension of a 3D input (N x C x L); the running mean and variance buffers have the same shape. For a 2D input (N x C), the shapes are identical: one value per channel, independent of N.

Batch norm computes the mean and variance across N, where N is the batch size, so the batch size is not a parameter you ever need to specify for batch normalization.
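A quick sketch to verify this: every parameter and buffer of BatchNorm1d is sized by num_features (C) alone, and in training mode each channel of the output is normalized over the batch dimension. The feature count and batch size below are arbitrary example values:

```python
import torch
import torch.nn as nn

# All parameters and buffers depend only on num_features, never on N.
bn = nn.BatchNorm1d(5)                  # C = 5 features (example value)
print(bn.weight.shape)                  # torch.Size([5])  - gamma
print(bn.bias.shape)                    # torch.Size([5])  - beta
print(bn.running_mean.shape)            # torch.Size([5])
print(bn.running_var.shape)             # torch.Size([5])

# In training mode the layer normalizes each channel using statistics
# computed across the batch dimension N:
x = torch.randn(32, 5)                  # (N, C) input with N = 32
y = bn(x)
print(y.mean(dim=0))                    # per-channel means, close to zero
```

Because the statistics are reduced over N at forward time, the same module instance works unchanged whether you later feed it batches of 32, 8, or 256.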