PyTorch 1.5.0
Hello, I am using PyTorch to minimise a cost function (no training involved). I was wondering whether using TorchScript rather than pure Python could lead to a speed-up. However, I am not sure whether it is possible to backpropagate through TorchScript code.
Let us consider the following toy class:
import torch
import torch.nn as nn

class Example(nn.Module):
    def forward(self, x):
        y = torch.tensor([0], dtype=x.dtype)
        y.requires_grad = True  # this attribute assignment is what fails under scripting
        return y
If I call example_scripted = torch.jit.script(Example()), I get the following error:
RuntimeError:
Tried to set an attribute: grad on a non-class: Tensor:
The issue seems to be the assignment to requires_grad. I am wondering whether it is possible to backpropagate through TorchScript at all. Is it?
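For what it is worth, backpropagation through a scripted module does seem to work when the gradient-requiring tensor is created outside the scripted forward and passed in as an input. A minimal sketch (the quadratic cost here is my own placeholder, not from the original class):

```python
import torch
import torch.nn as nn

class Cost(nn.Module):
    # hypothetical cost module: sum of squares of the input
    def forward(self, x):
        return (x * x).sum()

cost_scripted = torch.jit.script(Cost())

# requires_grad is set on the input, outside the scripted code
x = torch.ones(3, requires_grad=True)
loss = cost_scripted(x)
loss.backward()
print(x.grad)  # gradient of sum(x^2) is 2*x
```

This sidesteps the attribute-assignment error entirely, since the scripted code never needs to set requires_grad itself.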
Please see also https://github.com/pytorch/pytorch/issues/40561
Thank you.