Gradient is None after backward

Where exactly is out coming from?

If you are training a model, I'm assuming `out` is the output of your model, in which case the code should be:

loss.backward()

Not

out.backward()
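To illustrate, here's a minimal sketch of the usual training pattern (the model and data here are hypothetical, just for demonstration). Calling `backward()` on the scalar loss, rather than on the raw model output, is what populates `.grad` on the model's parameters:

```python
import torch
import torch.nn as nn

# Hypothetical model and data, just to show the pattern
model = nn.Linear(4, 1)
x = torch.randn(8, 4)
target = torch.randn(8, 1)

out = model(x)                                  # forward pass
loss = nn.functional.mse_loss(out, target)      # scalar loss
loss.backward()                                 # backprop from the scalar loss

# After backward(), leaf parameters have their gradients populated
print(model.weight.grad is not None)
```

Note also that `out` is usually not a scalar, so `out.backward()` would raise an error unless you pass it a `gradient` argument of the same shape.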