Tracking gradients

How can I get the gradients computed by backward() (or by the optimizer) so that I can track them with comet.ml?

    generator_loss.backward()
    optG.step()
    experiment.log_metrics({'Generator loss': generator_loss.item(),
                            'Critic loss': critic_loss.item()},
                           step=i)

Something like this, for example.

After calling generator_loss.backward(), the gradient of each tensor that requires gradients is stored in its .grad attribute (e.g. input.grad, or param.grad for model parameters). See the gradients section of https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html#sphx-glr-beginner-blitz-autograd-tutorial-py for instance.
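As a minimal sketch of putting the two together, you could log per-parameter gradient norms right after the backward pass. This assumes `generator` is your PyTorch model, `experiment` is an existing comet_ml Experiment, and `i` is your step counter, matching the names in your snippet:

    # Assumed names: `generator` (nn.Module), `experiment` (comet_ml
    # Experiment), `optG` (optimizer), `i` (step counter).
    generator_loss.backward()

    # Collect the L2 norm of each parameter's gradient.
    grad_norms = {}
    for name, param in generator.named_parameters():
        if param.grad is not None:  # some parameters may not receive a gradient
            grad_norms[f'grad_norm/{name}'] = param.grad.norm().item()

    # Send the gradient norms to comet.ml alongside your other metrics.
    experiment.log_metrics(grad_norms, step=i)

    optG.step()

Logging between backward() and step() keeps the measurement tied to the gradients that this particular update will use.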
