I could send the parameters during model training, but if you print them you'll see the initial parameters are still there, not the new ones. It seems like once you've initialized the layer with certain parameters, you can't just change them normally. There must be another way to do it.
If you construct LayerNorm with elementwise_affine=False, it has no learnable parameters, and you can use the functional interface as Peter suggests. With elementwise_affine=True you can still change the batch size, but normalized_shape (the last dimensions of the tensor) must stay the same, because the size of the learnable parameters is fixed when you initialize the module.
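A minimal sketch of both options (the tensor sizes here are just illustrative): a LayerNorm built with elementwise_affine=False has no parameters and accepts any batch size, while F.layer_norm lets you pass whatever weight and bias you like on each call.

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 10)  # (batch, features)

# Module with elementwise_affine=False: no learnable parameters at all.
ln = torch.nn.LayerNorm(10, elementwise_affine=False)
assert sum(p.numel() for p in ln.parameters()) == 0

# Functional interface: supply your own weight/bias explicitly per call.
weight = torch.ones(10)
bias = torch.zeros(10)
out = F.layer_norm(x, normalized_shape=(10,), weight=weight, bias=bias)

# The batch size is free to vary; only the normalized_shape is fixed.
out2 = ln(torch.randn(7, 10))  # different batch size, same last dim
```

With weight all ones and bias all zeros, the functional call matches the affine-free module's output, so you can verify both paths agree before substituting your own parameters.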