Strange autograd behaviour: Jupyter kernel dies

Hello!
I’ve encountered a very strange behaviour of autograd.
I have two functions, say f(X) and g(X), and I want to compute the gradient of f(g(X)) with respect to X, which is a Variable.

Both of these chunks of code

res = torch.norm(f(X))
res.backward()

and

res = torch.norm(g(X))
res.backward()

work well.

But when I do

res = torch.norm(f(g(X)))
res.backward()

my Jupyter kernel dies.

Do you know what is wrong?

I’ll try to write a minimal working example of f and g and add it later.
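In the meantime, here’s a minimal sketch of the pattern, with hypothetical placeholder definitions for f and g (the real ones are more involved):

import torch
from torch.autograd import Variable

# Hypothetical placeholders -- the real f and g are not shown in this
# thread; these only reproduce the call pattern.
def f(t):
    return torch.sin(t) * 2.0

def g(t):
    return torch.mm(t, t.t())

X = Variable(torch.randn(3, 3), requires_grad=True)

# Each function on its own backpropagates fine:
torch.norm(f(X)).backward()
X.grad = None  # gradients accumulate, so clear them between runs

torch.norm(g(X)).backward()
X.grad = None

# The composition is what kills the kernel:
torch.norm(f(g(X))).backward()
print(X.grad)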

Could you try running it in a terminal to see if the results are different?

Thanks a lot for your advice!
In the terminal it works well.

That’s pretty strange. Could you try updating Jupyter, PyTorch, etc.? Sometimes I also find that restarting the computer helps with Jupyter issues.

Thanks a lot! Updating from 2.0.2 to 3.0 really helped!