Am I using torch.jit correctly?

Hi. I am using PyTorch 1.0 for the first time since PyTorch 0.2.x.
I have a simple question about torch.jit.

Please check [this code](https://colab.research.google.com/gist/hellocybernetics/013f7d6fb007df1d8c70161872acce72/pytorch_jit_test.ipynb?authuser=1) on gist.

JIT did NOT improve the speed. Am I using torch.jit correctly?
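For context, the basic pattern being tested looks something like this (a minimal sketch with a hypothetical loop-heavy function, not the notebook's actual model):

```python
import torch

# Hypothetical loop-heavy function (illustrative, not the notebook's code).
def f(x):
    for _ in range(10):
        x = torch.tanh(x) + 0.1 * x
    return x

# Compile with TorchScript; torch.jit.script handles Python loops directly.
scripted_f = torch.jit.script(f)

x = torch.randn(1000)
# The scripted function should match eager mode numerically.
print(torch.allclose(f(x), scripted_f(x)))
```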


Your code looks reasonable to me. Keep in mind that using the JIT may not necessarily yield big performance increases. Right now, the primary use case for the JIT is running PyTorch models in production without a dependency on Python.
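For example, the production workflow is roughly the following (module name and file path here are illustrative): script a module and serialize it, and the resulting archive can later be loaded from C++ via libtorch with no Python dependency.

```python
import torch

class MyModule(torch.nn.Module):  # illustrative module, not from the thread
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

# Script and serialize; the saved .pt archive can be loaded from C++
# (libtorch) with torch::jit::load, i.e. without Python in production.
scripted = torch.jit.script(MyModule())
scripted.save("my_module.pt")

# Round-trip check in Python: loading back gives the same outputs.
loaded = torch.jit.load("my_module.pt")
x = torch.randn(3, 4)
assert torch.allclose(scripted(x), loaded(x))
```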


Thank you for your reply.

Before coming to PyTorch 1.0, I was using TensorFlow 1.x. Lately, TF has an “eager execution mode”, which is define-by-run. In TF 2.0, eager execution is the default, so I think the code will become Pythonic like PyTorch, and the difference in how the two feel to use will disappear.

TF 2.0 also has a great JIT feature named tf.function, which translates eager code into a function that runs internally as a TF graph. This JIT makes eager code, which is much slower than PyTorch, run faster than PyTorch. So I tried to make PyTorch faster with torch.jit. (I think Pyro uses torch.jit for speed in its variational inference API.)
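A rough way to check whether torch.jit helps in a given case is to time the eager and scripted versions side by side (a sketch with an arbitrary loop-heavy function; the function and iteration counts are illustrative):

```python
import timeit
import torch

# Arbitrary loop-heavy function for a rough eager-vs-scripted comparison.
def loop_fn(x):
    for _ in range(100):
        x = x * 1.01 + torch.sin(x)
    return x

scripted = torch.jit.script(loop_fn)
x = torch.randn(100)
scripted(x)  # warm-up call so compilation time is not measured

eager_s = timeit.timeit(lambda: loop_fn(x), number=100)
jit_s = timeit.timeit(lambda: scripted(x), number=100)
print(f"eager: {eager_s:.4f}s, scripted: {jit_s:.4f}s")
```

Whether the scripted version wins depends heavily on how much per-op Python overhead the eager code has; for large tensor ops the kernels dominate and the JIT changes little.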

If torch.jit is not for speed, do we even need torch.jit when prototyping in research? And when considering production, what is the difference between using Caffe2 and torch.jit?