How to add gradient noise?

Hello everyone!

My PyTorch version is 1.10.2.

I want to add noise to the gradients used by the Adam optimizer, using torch.normal.

But each grad tensor has a different shape, so how can I add normally distributed noise to the gradients?

You can directly add the noise to the gradient via:

loss.backward()
model.layer.weight.grad = model.layer.weight.grad + torch.randn_like(model.layer.weight.grad)
optimizer.step()
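
For context, here is a minimal sketch of a full training step that adds standard-normal noise to every parameter's gradient between backward() and step(); the model, loss, and data below are placeholders chosen just for illustration:

import torch
import torch.nn as nn

model = nn.Linear(10, 2)                                   # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

inputs = torch.randn(8, 10)                                # dummy batch
targets = torch.randint(0, 2, (8,))

optimizer.zero_grad()
loss = criterion(model(inputs), targets)
loss.backward()

# add standard-normal noise to every gradient before the optimizer step
with torch.no_grad():
    for p in model.parameters():
        if p.grad is not None:
            p.grad += torch.randn_like(p.grad)

optimizer.step()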

Thanks!

Then, how can I add normally distributed noise with a specific mean and standard deviation?

You can add the mean and scale by the stddev as:

... + (torch.randn_like(model.layer.weight.grad) * stddev + mean)
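
A short sketch of what that looks like over all parameters, using example values mean = 0.0 and stddev = 0.01 (chosen arbitrarily here); torch.normal gives an equivalent way to draw the noise directly:

mean, stddev = 0.0, 0.01   # example values, tune for your use case

with torch.no_grad():
    for p in model.parameters():
        if p.grad is not None:
            # shift/scale standard-normal noise ...
            p.grad += torch.randn_like(p.grad) * stddev + mean
            # ... or, equivalently, sample the noise with torch.normal:
            # p.grad += torch.normal(mean, stddev, p.grad.shape).to(p.grad.device)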

Thanks! I'll try this!