Dear all:
I know the typical scenario:
optimizer.zero_grad()
loss.backward()
optimizer.step()
But when my gradients are obtained via grads = autograd.grad(loss, vars_list), how do I apply the gradient update with Adam?
grads = autograd.grad(loss, vars_list)
for p, g in zip(vars_list, grads):
    p.grad = g  # fill_() only accepts a scalar; assign the gradient tensor instead
optimizer.step()
Is this right?
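For reference, here is a minimal self-contained sketch of what I mean; the tiny linear model and random data are made up purely for illustration:

import torch
from torch import autograd

# Hypothetical setup: a small linear model, just for demonstration.
w = torch.randn(3, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
vars_list = [w, b]
optimizer = torch.optim.Adam(vars_list, lr=1e-3)

x = torch.randn(8, 3)
y = torch.randn(8, 1)
loss = ((x @ w + b - y) ** 2).mean()

# Gradients come from autograd.grad instead of loss.backward().
grads = autograd.grad(loss, vars_list)

# Write each gradient into .grad so Adam can consume it on step().
for p, g in zip(vars_list, grads):
    p.grad = g  # g is already detached unless create_graph=True was passed
optimizer.step()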