Hi there,
Please consider the following case:
aa = torch.FloatTensor([10, 10])
aa.requires_grad = True
y = net(x)
optimizer = optim.SGD(net.parameters(), lr=0.01, momentum=0.9)
loss = aa * criterion(y)
......
Suppose we want to minimize the loss by alternately updating aa and the parameters of net. How can this be implemented with the backward() and step() functions?
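For context, here is a minimal sketch of the kind of alternating scheme I mean. It assumes one SGD optimizer per parameter group, and the net, x, target, and criterion below are just toy placeholders standing in for the real model and data:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Toy placeholders -- replace with the real model, data, and loss.
net = nn.Linear(4, 2)
x = torch.randn(8, 4)
target = torch.randn(8, 2)
criterion = nn.MSELoss(reduction="none")  # per-element losses, so aa can weight them

aa = torch.tensor([10.0, 10.0], requires_grad=True)

# One optimizer per group makes alternating updates straightforward.
opt_net = optim.SGD(net.parameters(), lr=0.01, momentum=0.9)
opt_aa = optim.SGD([aa], lr=0.01, momentum=0.9)

for step in range(10):
    y = net(x)
    # aa broadcasts over the per-element losses; .sum() gives a scalar loss.
    loss = (aa * criterion(y, target)).sum()

    # Zero both groups so stale gradients never accumulate,
    # then step only the group being updated this iteration.
    opt_net.zero_grad()
    opt_aa.zero_grad()
    loss.backward()
    if step % 2 == 0:
        opt_net.step()  # even steps: update the network
    else:
        opt_aa.step()   # odd steps: update aa
```

Calling zero_grad() on both optimizers each iteration matters: backward() populates gradients for every tensor with requires_grad=True, so the group that is not stepped would otherwise accumulate stale gradients.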
Thank you so much!