I’m just getting started with PyTorch, and one thing I’m curious about is whether there’s a way to (visually) see the autograd tree. I’m computing some intermediate values inside `with torch.no_grad():` blocks, but I want to verify that the code is doing what I think it’s doing when it comes to differentiation.
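For example, here’s the kind of check I’d like to be able to make (toy values, just for illustration):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)

# Inside no_grad, the result is detached from the autograd graph
with torch.no_grad():
    y = x * 3
print(y.requires_grad, y.grad_fn)  # False None

# Outside no_grad, the multiply is recorded in the graph
z = x * 3
print(z.requires_grad, z.grad_fn)  # True <MulBackward0 object ...>
```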
Did you check here?
https://pytorch.org/tutorials/beginner/basics/autogradqs_tutorial.html
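That tutorial covers the mechanics. For an actual picture of the graph, one option (assuming it fits your setup) is the third-party torchviz package, which renders the grad_fn graph with Graphviz; a minimal sketch:

```python
import torch
from torchviz import make_dot  # pip install torchviz; needs Graphviz installed

x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)
y = (w * x).sum()

# Writes the autograd graph of y to graph.png
make_dot(y, params={"w": w, "x": x}).render("graph", format="png")
```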
If you call backward() more than once on the same graph, you need backward(retain_graph=True); otherwise the graph’s intermediate buffers are freed after the first call and the second backward raises an error.
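A minimal sketch of what I mean (toy values):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x ** 2

# retain_graph=True keeps the graph's buffers alive for another pass
y.backward(retain_graph=True)
print(x.grad)  # tensor([4.])

# Without retain_graph=True above, this second call would raise
# "Trying to backward through the graph a second time"
y.backward()
print(x.grad)  # gradients accumulate: tensor([8.])
```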
If you want the partial derivatives on each arrow, I haven’t seen a tool for that. But you can just print w and w.grad (note that grad is an attribute, not a method).
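As a rough sketch, you can also walk grad_fn.next_functions yourself to dump the tree as text. This pokes at autograd internals, so treat it as a debugging hack rather than a stable API:

```python
import torch

w = torch.tensor([1.5], requires_grad=True)
loss = (w * 2 + 1).pow(2).sum()
loss.backward()
print(w, w.grad)

# Crude textual dump of the autograd tree; leaves appear as AccumulateGrad
def dump(fn, depth=0):
    if fn is None:  # non-tensor inputs have no grad_fn entry
        return
    print("  " * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        dump(next_fn, depth + 1)

dump(loss.grad_fn)
```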