How to change a parameter's weight values without turning it into a non-leaf variable?

import torch
a = torch.nn.ConvTranspose2d(1, 1, 1)
a.weight.requires_grad = False

The above code works well; however, the code below does not:

import torch
a = torch.nn.ConvTranspose2d(1, 1, 1)
a.weight[0, 0, 0, 0] = 1
a.weight.requires_grad = False

It complains: RuntimeError: you can only change requires_grad flags of leaf variables. The indexed assignment is an in-place operation that autograd records, so the weight ends up with a grad_fn and is no longer a leaf variable.
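
A quick way to see what is going on, sketched under the assumption of standard autograd behavior (is_leaf and grad_fn are the relevant attributes):

import torch
a = torch.nn.ConvTranspose2d(1, 1, 1)
# A freshly created parameter is a leaf: it was created by the user and has no grad_fn.
print(a.weight.is_leaf)   # True
print(a.weight.grad_fn)   # None
# Any operation that autograd records (such as the indexed assignment above)
# attaches a grad_fn, and a tensor with a grad_fn is not a leaf,
# so its requires_grad flag can no longer be changed directly.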


OK, I checked the source code of torch.nn.init; just use torch.no_grad():

import torch
a = torch.nn.ConvTranspose2d(1, 1, 1)
with torch.no_grad():
    a.weight[0, 0, 0, 0] = 1
a.weight.requires_grad = False
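
Equivalently, the helpers in torch.nn.init already wrap their in-place writes in torch.no_grad(), so a sketch like this should also keep the parameter a leaf (note that constant_ fills the whole tensor rather than a single entry):

import torch
a = torch.nn.ConvTranspose2d(1, 1, 1)
torch.nn.init.constant_(a.weight, 1)   # in-place fill, done under no_grad internally
a.weight.requires_grad = False
print(a.weight.is_leaf)        # True: still a leaf variable
print(a.weight.requires_grad)  # False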