Is there a way to manipulate the gradient in an Adam step?

Hi There,

Just curious, is there a way to manually set the gradient used for an Adam step?

You could manipulate the gradient of each parameter directly (via its .grad attribute) before calling step(), register backward hooks to modify gradients as they are computed, or, if needed, create a custom Adam implementation by reusing the PyTorch one. See the sketch below.
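For example, something like this minimal sketch with a toy `nn.Linear` model (the scaling hook and the clamp on `.grad` are just placeholders for whatever manipulation you actually need):

```python
import torch

# Toy model and optimizer, just for illustration.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(8, 4)
loss = model(x).sum()

# Option 1: a tensor hook that rewrites the gradient during backward().
# The hook receives the gradient and may return a modified version,
# which is then accumulated into weight.grad.
model.weight.register_hook(lambda grad: grad * 0.5)  # e.g. scale it

loss.backward()

# Option 2: edit .grad directly after backward() and before step().
for p in model.parameters():
    if p.grad is not None:
        p.grad.clamp_(-1.0, 1.0)  # e.g. clip each gradient element

optimizer.step()   # Adam consumes whatever is currently stored in .grad
optimizer.zero_grad()
```

Since Adam only reads the .grad attributes at step() time, anything you write into them between backward() and step() will be used for the update.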
