import torch
import torch.nn as nn

n_in, n_out = 4, 2  # example dimensions

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(n_in, n_out)

    def forward(self, x):
        x = self.fc1(x)
        x = nn.Tanh()(x)
        return x

model = Net()

I want to set some of the weights manually. I tried:

model.fc1.weight[0][0] = 0.5

After this assignment, the repr of model.fc1.weight changes from requires_grad=True to grad_fn=<CopySlices>.

Also, if I now train the model I get: ValueError: can't optimize a non-leaf Tensor.

Can someone please explain how to do this properly?
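For reference, a minimal sketch of what I am trying to achieve, assuming the assignment should not be recorded by autograd (here using a standalone nn.Linear layer in place of model.fc1, with made-up dimensions):

```python
import torch
import torch.nn as nn

fc1 = nn.Linear(4, 2)  # stands in for model.fc1

# Assigning inside torch.no_grad() keeps autograd from recording
# the in-place copy, so the parameter remains a leaf tensor and
# can still be passed to an optimizer afterwards.
with torch.no_grad():
    fc1.weight[0][0] = 0.5

print(fc1.weight[0][0].item())   # 0.5
print(fc1.weight.is_leaf)        # True
print(fc1.weight.requires_grad)  # True
```

Is wrapping the assignment like this the intended way, or is there a better approach?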