Hi all,
I want to exchange the gradients of two networks. Will the following code work?
loss.backward()
for p,q in zip(model1.parameters(), model2.parameters()):
p.grad = q.grad
q.gard = p.grad
optimizer.step()
Yes, this should be possible (besides the minor typo in q.gard, which should be q.grad).
However, I'm not sure what the second assignment is supposed to do: the first line already assigns the gradients from model2 to model1, so at that point p.grad and q.grad refer to the same tensor, and the second assignment just assigns it back instead of performing a swap.
The optimizer will just use the .grad attributes of all parameters without any knowledge of how these gradients were created (at least in the default use cases).
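To actually swap (rather than copy) the gradients, the original p.grad tensor has to be kept around before it is overwritten; a tuple assignment does this implicitly. Here is a minimal sketch, assuming two hypothetical networks with identical architectures and an illustrative MSE loss (the models, shapes, and loss are placeholders, not from the original post):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the two networks; any two models with
# matching parameter shapes would work the same way.
model1 = nn.Linear(4, 2)
model2 = nn.Linear(4, 2)

# Populate .grad on both models with an illustrative loss.
x = torch.randn(8, 4)
nn.functional.mse_loss(model1(x), torch.randn(8, 2)).backward()
nn.functional.mse_loss(model2(x), torch.randn(8, 2)).backward()

# Clone the gradients first, purely so the swap can be verified below.
before1 = [p.grad.clone() for p in model1.parameters()]
before2 = [q.grad.clone() for q in model2.parameters()]

# Tuple assignment evaluates the right-hand side before assigning,
# so both original tensors survive and the .grad attributes are swapped.
for p, q in zip(model1.parameters(), model2.parameters()):
    p.grad, q.grad = q.grad, p.grad
```

After this loop, a subsequent optimizer.step() on either model would apply the other model's gradients, since the optimizer only reads the .grad attributes.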