Saving warmed-up TorchScript models directly

Heya, I’ve been doing some googling and it looks like the first execution of a TorchScript model includes a heavy warmup period. In my case, it’s almost 5 minutes.

This is pretty painful and I’m wondering if I can directly save the optimized model in the first place so I don’t have to go through this warmup period each time.
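For reference, here's a minimal sketch of the pattern I mean (the model, shapes, and file names are placeholders, not my actual model): `torch.jit.save` serializes the scripted code, but a freshly loaded copy still seems to go through warmup again.

```python
import time
import torch
import torch.nn as nn

# Toy stand-in for the real model (the actual one warms up for ~5 minutes).
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = torch.jit.script(TinyModel().eval())
x = torch.randn(1, 8)

with torch.no_grad():
    # The first calls trigger profiling/optimization passes; later calls
    # run the already-optimized code.
    t0 = time.time(); first = model(x); cold_time = time.time() - t0
    t0 = time.time(); later = model(x); hot_time = time.time() - t0

# Saving and reloading round-trips the scripted code, but not the
# warmed-up in-memory optimization state.
torch.jit.save(model, "model.pt")
reloaded = torch.jit.load("model.pt")
with torch.no_grad():
    reloaded_out = reloaded(x)
```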

Hi, could you try running it on master or the nightly build? Compilation times were improved for the 1.7 release.

Sorry to hear that. Unfortunately, we can’t save the warmed-up model today due to a number of optimizations relying on in-memory data structures that can’t be serialized. We have some longer-term infrastructural work planned to improve compile times but it hasn’t landed yet.

Two things that would help us improve the situation:

  1. We recently made a number of improvements to compilation and optimization time. Can you try the nightly or the 1.7 RC and see if you get improvements (try running the model a few times)?
  2. If possible, can you provide the .pt file for the model that is taking a really long time to warm up, as well as some example inputs? It will help us understand what sort of models are causing long compilation times.
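The "run it a few times" measurement could look something like this (the model and input shapes here are placeholders; substitute your real scripted module). The first iteration or two will include compilation overhead, so compare the first timing against the last:

```python
import time
import torch
import torch.nn as nn

# Placeholder model; swap in the real TorchScript module here.
model = torch.jit.script(nn.Sequential(nn.Linear(16, 16), nn.ReLU()).eval())
x = torch.randn(4, 16)

times = []
with torch.no_grad():
    for _ in range(5):
        t0 = time.time()
        model(x)
        times.append(time.time() - t0)

# Early iterations include profiling/compilation; later ones are "hot".
print([f"{t:.4f}s" for t in times])

# Saving the .pt plus an example input makes the slow case reproducible.
torch.jit.save(model, "slow_model.pt")
torch.save(x, "example_input.pt")
```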

Thanks to both you and eellison for the replies.

Using 1.7 seems to have done the trick! Compile times went from 200s -> 2s. When do you think it will make it to a stable release?

The 1.7 release is coming soon, but I can’t give you an exact date. Stay tuned…