Is there any way to add noise to trained weights?

Hello, I’ve got a trained network.
Is there any way to add noise (for example Gaussian) to the trained weights?
Thank you.

You could use this sample code to add Gaussian noise to all parameters:

with torch.no_grad():
    for param in model.parameters():
        # add zero-mean Gaussian noise with a standard deviation of 0.1;
        # randn_like keeps the noise on the same device and dtype as the parameter
        param.add_(torch.randn_like(param) * 0.1)

If you want to skip certain parameters, you could use model.named_parameters() and filter out the parameters you don't want to change.
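For example, a minimal sketch assuming you want to skip bias parameters (the name check is just an illustration; adjust it to your model):

with torch.no_grad():
    for name, param in model.named_parameters():
        # skip parameters you don't want to perturb, e.g. biases
        if 'bias' in name:
            continue
        param.add_(torch.randn_like(param) * 0.1)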


Thank you very much!

Dear Ptrblck,

I only want to add the noise to the weights in each epoch. Do you have a more convenient way to do that, instead of filtering out the other parameters one by one?

Moreover, I am not quite sure how to use this piece of code. Should I add it to the training loop?

Could you provide more specific information on how to use it?

You would have to access each parameter somehow, but a custom function combined with model.apply might be more convenient:

def add_noise_to_weights(m):
    # called recursively on every submodule via model.apply
    with torch.no_grad():
        if hasattr(m, 'weight'):
            m.weight.add_(torch.randn_like(m.weight) * 0.1)

model.apply(add_noise_to_weights)

It depends on your use case; you could call this function after each iteration, once per epoch, etc.
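For example, a rough sketch of calling it once per epoch inside a standard training loop (train_loader, criterion, optimizer, and num_epochs are placeholder names):

for epoch in range(num_epochs):
    # perturb the weights once at the start of each epoch
    model.apply(add_noise_to_weights)
    for data, target in train_loader:
        optimizer.zero_grad()
        output = model(data)
        loss = criterion(output, target)
        loss.backward()
        optimizer.step()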


Thank you very much, I will try this.

Hi, I’m wondering why you added torch.no_grad() at the top. Is it to prevent the parameters from updating? So if I wish to train the parameters after adding the Gaussian noise, should I not include the no_grad guard?

Thanks!

This is just a guard to make sure this particular operation is not recorded by Autograd.
It's most likely not necessary at this point, but I'm used to writing it that way.
The parameters will still be trained; just the addition of the Gaussian noise won't be recorded.
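A small self-contained sketch of what the guard does (w stands in for any parameter; this is only for illustration):

import torch

w = torch.nn.Parameter(torch.ones(2, 2))

# the in-place addition is not recorded by Autograd because of the guard
with torch.no_grad():
    w.add_(torch.randn_like(w) * 0.1)

# w is still a leaf parameter with requires_grad=True, so it keeps receiving gradients
loss = (w ** 2).sum()
loss.backward()
print(w.requires_grad, w.grad.shape)  # True torch.Size([2, 2])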


When I try this method, I get an error in the backpropagation step:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [1, 1]], which is output 0 of TBackward,.....