How to apply the Adam/RMSprop optimizer when gradients are obtained by autograd.grad?

Dear all:
I know the typical scenario :slight_smile:

optimizer.zero_grad()
loss.backward()
optimizer.step()

BUT, when my gradients are obtained with grads = autograd.grad(loss, vars_list), how do I apply the gradient update using Adam?

grads = autograd.grad(loss, vars_list)
for p, g in zip(vars_list, grads):
    p.grad = g  # assign the computed gradient (fill_ only accepts a scalar)
optimizer.step()

Is this right?


Hi,

Yes, I think that is the right way to do it.
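For reference, here is a minimal self-contained sketch of the whole flow, assuming the gradients are written into each parameter's .grad before optimizer.step(). The toy model, data, and learning rate below are placeholders for illustration, not part of your setup:

import torch
from torch import autograd, nn, optim

# Toy model, data, and hyper-parameters, used only to make the example runnable.
model = nn.Linear(4, 1)
params = list(model.parameters())
optimizer = optim.Adam(params, lr=1e-3)

x = torch.randn(8, 4)
y = torch.randn(8, 1)
loss = ((model(x) - y) ** 2).mean()

# Compute gradients explicitly instead of calling loss.backward().
grads = autograd.grad(loss, params)

optimizer.zero_grad()
for p, g in zip(params, grads):
    p.grad = g.detach()  # write the externally computed gradient into .grad
optimizer.step()         # Adam reads p.grad as usual

Calling optimizer.zero_grad() first is optional here since the assignment overwrites .grad anyway, but it keeps the code consistent with the usual backward() pattern.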
