I got the following issue. Kindly help

This error is raised if you try to call `backward()` on a tensor that isn't attached to any computation graph.
Common issues are:

  • explicitly detaching the tensor from the computation graph via `tensor.detach()`
  • disabling gradient calculation e.g. via torch.no_grad() or by globally disabling Autograd
  • using third-party libraries (e.g. numpy) without writing a custom `autograd.Function` and its `backward` method
  • implicitly detaching the tensor by re-wrapping it via `x = torch.tensor(x)`
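
Each of these causes can be reproduced in a few lines. The sketch below first shows a working `backward()` call, then triggers the same `RuntimeError` via each of the four patterns (the helper `fails_backward` is just for illustration, not a PyTorch API):

```python
import numpy as np
import torch

# Working case: x0 is a leaf with requires_grad=True, so y0 has a grad_fn.
x0 = torch.tensor([2.0], requires_grad=True)
y0 = (x0 ** 2).sum()
y0.backward()
print(x0.grad)  # tensor([4.])

def fails_backward(t):
    """Return True if t.backward() raises the missing-grad_fn RuntimeError."""
    try:
        t.backward()
        return False
    except RuntimeError:
        return True

x = torch.tensor([2.0], requires_grad=True)

# 1) Explicit detaching removes the tensor from the graph.
print(fails_backward((x ** 2).detach().sum()))  # True

# 2) Gradient calculation disabled: no graph is recorded inside no_grad().
with torch.no_grad():
    y = (x ** 2).sum()
print(fails_backward(y))  # True

# 3) Round-tripping through numpy loses the Autograd history
#    (unless the op is wrapped in a custom autograd.Function).
y = torch.from_numpy(np.square(x.detach().numpy())).sum()
print(fails_backward(y))  # True

# 4) Re-wrapping implicitly detaches (PyTorch also warns here and
#    recommends sourceTensor.clone().detach() instead).
y = torch.tensor((x ** 2).sum())
print(fails_backward(y))  # True
```

In every failing case the printed tensor would have `requires_grad=False` and `grad_fn=None`, which is exactly what the error message complains about.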