I am using scripting to convert a model to a TorchScript module. How can I disable gradient calculation in the module's `predict` method? I tried the standard `torch.no_grad()` and `torch.set_grad_enabled()` options, but it seems neither is supported inside a `jit.script_method`.
Did you find a solution? I'm facing the same problem and getting an OOM error (when I loop with `with torch.no_grad():`, I can see the GPU memory increasing).
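One workaround that sidesteps the limitation entirely: instead of putting `no_grad()` inside the scripted code, wrap the *call* to the scripted module in `torch.no_grad()` from eager Python. The context manager then applies to everything the scripted module executes, so no graph is recorded and activations are not retained. A minimal sketch (the `Net` module here is a hypothetical stand-in for your model):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Hypothetical stand-in for the model being scripted."""
    def forward(self, x):
        return x * 2 + 1

scripted = torch.jit.script(Net())

x = torch.randn(4, 4, requires_grad=True)

# Wrap the call site, not the scripted code: grad mode is thread-local
# state that the scripted module inherits from the caller.
with torch.no_grad():
    out = scripted(x)

# The output is detached from the autograd graph, so no activations
# are kept alive across loop iterations and memory stays flat.
print(out.requires_grad)
```

If you cannot change the call site, another common pattern is to call `.detach()` on the inputs (or the output) inside the scripted method, which also prevents the autograd graph from growing across iterations.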