Behaviour of nn.utils.weight_norm during inference

Hello,

I know the weight_norm utility is quite simple to use in PyTorch:

import torch.nn as nn
from torch.nn.utils import weight_norm

layer = weight_norm(nn.Conv2d(in_channels, out_channels, kernel_size=3))

However, does it still perform the weight re-parameterization during inference, when the model is set to eval() mode and running inside torch.no_grad(), or does it only do so during training and freeze the weights at inference? I could not find this in the documentation or by checking the source code on GitHub.
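To make the inference setting concrete, this is roughly what I mean (the conv shapes and the random input are just placeholders):

import torch
import torch.nn as nn
from torch.nn.utils import weight_norm

conv = weight_norm(nn.Conv2d(3, 16, kernel_size=3))
conv.eval()
with torch.no_grad():
    out = conv(torch.randn(1, 3, 32, 32))  # is the weight recomputed from weight_g / weight_v on this forward pass?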

At inference time, should I explicitly deactivate weight normalization using remove_weight_norm(), or not?
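In other words, should I do something like the following before running inference? From what I can tell, weight_norm replaces the weight parameter with weight_g and weight_v, and remove_weight_norm folds them back into a single weight, but I am not sure whether that step is actually required.

import torch.nn as nn
from torch.nn.utils import weight_norm, remove_weight_norm

conv = weight_norm(nn.Conv2d(3, 16, kernel_size=3))
print(sorted(n for n, _ in conv.named_parameters()))  # ['bias', 'weight_g', 'weight_v']

remove_weight_norm(conv)
print(sorted(n for n, _ in conv.named_parameters()))  # ['bias', 'weight'] -- is this all that is needed for inference?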

Thank you