How to normalize the convolution parameters during the forward pass?

Hi, I have a question that has confused me a lot.
I want to normalize the network parameters on every forward pass. How should I do it?

import torch
import torch.nn as nn

class mynet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(128, 128, 3, 1, 1)

    def forward(self, input):
        # normalize the weight in place along dim 1 before convolving
        # (note: an in-place op on a Parameter that requires grad raises a RuntimeError)
        self.conv1.weight /= torch.norm(self.conv1.weight, p=2, dim=1, keepdim=True)
        output = self.conv1(input)
        return output

I think I should do it like this, but I am not sure. Please help me. Thanks!

Changing a parameter during the forward pass isn't usually done.
There are two more natural things you could do:

  • Normalize the parameters before applying the model (see the first sketch after this list).
  • Normalize the weight on the fly. One way to do this is to subclass nn.Conv2d: in the forward pass, compute weight = self.weight / ... and pass that local weight to torch.nn.functional.conv2d. This would not normalize the stored parameters (but you could do that after training and then replace your custom conv layer with the standard one). See the second sketch after this list.
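Here is a minimal sketch of the first option. It rewrites the stored weight in place under torch.no_grad(), so autograd is not involved; the dim=1 choice (matching your snippet) and the epsilon are assumptions to adapt to your needs.

import torch
import torch.nn as nn

conv = nn.Conv2d(128, 128, 3, 1, 1)

# normalize the stored parameter in place, outside of autograd
with torch.no_grad():
    norm = conv.weight.norm(p=2, dim=1, keepdim=True)
    conv.weight.div_(norm.clamp_min(1e-12))  # epsilon guards against division by zero

output = conv(torch.randn(1, 128, 32, 32))
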
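And a minimal sketch of the second option; NormalizedConv2d is a made-up name for illustration, with the same assumed dim and epsilon:

import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedConv2d(nn.Conv2d):
    def forward(self, input):
        # build a normalized local copy; the stored self.weight stays untouched
        norm = self.weight.norm(p=2, dim=1, keepdim=True)
        weight = self.weight / norm.clamp_min(1e-12)  # out of place, so gradients flow
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

conv = NormalizedConv2d(128, 128, 3, 1, 1)
output = conv(torch.randn(1, 128, 32, 32))

Because the second version computes the normalized weight out of place, the normalization itself is part of the autograd graph and gradients flow through it, which is exactly what the first version does not give you.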

Note that the two strategies are fundamentally different, and you'd need to find out which gives you better results.

Best regards

Thomas