How does backward() work for a torch Variable?

Note that if you're calling it on a scalar loss/cost variable, you don't need to provide a gradient argument: autograd assumes a gradient of ones. For a non-scalar output, you must pass the gradient tensor explicitly.
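For illustration, here's a minimal sketch of both cases (in modern PyTorch, Variable has been merged into Tensor, so a tensor with `requires_grad=True` plays the same role):

```python
import torch

# A tensor with requires_grad=True tracks operations for autograd.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Scalar loss: backward() needs no argument; autograd implicitly
# uses a gradient of 1.0 for the scalar output.
loss = (x ** 2).sum()
loss.backward()
print(x.grad)  # tensor([2., 4., 6.]) -- d(sum(x^2))/dx = 2x

# Non-scalar output: a gradient tensor must be passed explicitly,
# e.g. torch.ones_like(y) to reproduce the scalar default.
x.grad = None  # clear accumulated gradients before the next backward
y = x ** 2
y.backward(torch.ones_like(y))
print(x.grad)  # tensor([2., 4., 6.])
```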
