Parameter dependent on another parameter

I’m trying to change the weights of a Conv2d after randomly initializing it:

self.convs.weight = torch.nn.Parameter(data=w, requires_grad=False)

But this only trains the weights of convs, not w (a tensor).

And convs.weight can only be assigned a Parameter, if I remember correctly.

How do I train w and not convs.weight?

Use a functional if you don’t want to keep track of the weights. What are you trying to do?
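For the Conv2d case, the functional suggestion above might look like this, a minimal sketch where w stays a plain trainable tensor and is passed to F.conv2d directly instead of being assigned to a module (the filter shape and input size here are assumptions, not from the original post):

```python
import torch
import torch.nn.functional as F

# w is an ordinary trainable tensor, never registered on an nn.Conv2d module
w = torch.randn(8, 3, 3, 3, requires_grad=True)  # (out_ch, in_ch, kH, kW) -- assumed shape

x = torch.randn(4, 3, 32, 32)        # dummy input batch
out = F.conv2d(x, w, padding=1)      # convolve with w directly

out.mean().backward()
print(w.grad.shape)                  # gradients flow straight into w
```

Since there is no module holding the weight, w is whatever tensor you choose, and its gradient lands on w itself rather than on a module’s .weight parameter.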

You are making the weights be w, so what does it mean to train w but not the weights?

import torch

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

a = torch.nn.Parameter(torch.randn(1, dtype=torch.float, device=device))
b = torch.nn.Parameter(torch.randn(1, dtype=torch.float, device=device))
c = a + 1
d = torch.nn.Parameter(c, requires_grad=True)  # wrapping c makes d a new leaf tensor
for epoch in range(n_epochs):
    yhat = d + b * x_train_tensor
    error = y_train_tensor - yhat
    loss = (error ** 2).mean()
    loss.backward()
    print(a.grad)
    print(b.grad)
    print(c.grad)
    print(d.grad)

Printing the gradients out I get
None
tensor([-0.8707])
None
tensor([-1.1125])

What I’m trying to do is learn a, not just d, if that makes sense?
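The gradients above come out as None for a because wrapping c in nn.Parameter creates a brand-new leaf tensor, which cuts the autograd link back to a. A usual fix is to not re-wrap the derived value at all: keep a and b as the only Parameters and recompute d = a + 1 inside the loop so it stays in the graph. A minimal sketch under that assumption (the toy data and optimizer settings are mine, not from the thread):

```python
import torch

torch.manual_seed(0)
x_train = torch.randn(100, 1)
y_train = 1.0 + 2.0 * x_train            # toy targets: intercept 1, slope 2 (assumed)

a = torch.nn.Parameter(torch.randn(1))   # the tensor we actually want to learn
b = torch.nn.Parameter(torch.randn(1))
opt = torch.optim.SGD([a, b], lr=0.1)

for epoch in range(200):
    d = a + 1                            # derived value: no Parameter wrap, graph intact
    yhat = d + b * x_train
    loss = ((y_train - yhat) ** 2).mean()
    opt.zero_grad()
    loss.backward()                      # gradient now reaches a through d
    opt.step()

print(a.grad)                            # non-None: a receives gradients
```

Since the target intercept is 1 and d = a + 1, training should drive a toward 0 and b toward 2. The same idea applies to the Conv2d question: compute the derived weight from the tensor you want to train and use it via the functional API, rather than assigning a new Parameter.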