Is it possible to backpropagate in TorchScript?

PyTorch 1.5.0

Hello, I am using PyTorch to minimise a cost function; no training is involved. I was wondering whether using TorchScript rather than pure Python could lead to a speed-up. However, I am not sure whether it is possible to backpropagate in TorchScript.

Let us consider the following toy class:

import torch
import torch.nn as nn

class Example(nn.Module):
  def forward(self, x):
    y = torch.tensor([0], dtype=x.dtype)
    y.requires_grad = True
    return y

If I call example_scripted = torch.jit.script(Example()), I get the following error:

Tried to set an attribute: grad on a non-class: Tensor:

The issue seems to be the requires_grad assignment. I am wondering whether it is possible to use TorchScript to backpropagate. Is it?

Thank you.

Yeah, backprop works, but some Python code won’t compile. In your (contrived) case, tensor(…).requires_grad_(True) would compile, but you don’t usually need to be explicit about requires_grad, as it propagates through operations, and there is also .detach().
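For illustration, a minimal sketch (the Square module and variable names are mine, not from your code) showing that backward() runs through a module compiled with torch.jit.script, with requires_grad simply propagating from the input:

```python
import torch
import torch.nn as nn

class Square(nn.Module):
    def forward(self, x):
        # no explicit requires_grad needed: it propagates from the input
        return (x * x).sum()

scripted = torch.jit.script(Square())
x = torch.tensor([3.0], requires_grad=True)
loss = scripted(x)
loss.backward()  # gradients flow through the scripted module
print(x.grad)    # d(x^2)/dx = 2x -> tensor([6.])
```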

Yes, you can train; see this book:
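As a rough sketch of training a scripted module (the use of nn.Linear, the target function, and the optimiser settings here are my own choices, not from the book):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# script a standard module; its parameters remain trainable
model = torch.jit.script(nn.Linear(1, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

xs = torch.randn(16, 1)
ys = 2.0 * xs  # fit the line y = 2x

for _ in range(200):
    opt.zero_grad()
    loss = ((model(xs) - ys) ** 2).mean()
    loss.backward()  # backprop through the scripted module
    opt.step()

print(model.weight.item())  # close to 2.0 after training
```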
