Alternative to TorchScript that supports control flow and autograd

I’m currently relying on TorchScript for many models, since I need to execute them from C++. I recently learned, however, that there are no plans to keep developing the JIT.

From the documentation I gather the alternatives are torch.compile, if running in Python, and torch.export if running in a Python-less environment.

I believe that neither of these fits my use case, since I need to call autograd for inference, and my model requires some control flow. Even if I worked to fully get rid of the control flow (unlikely to be possible), the lack of autograd is a dealbreaker.

I wanted to know what my options are moving forward. Is TorchScript essentially deprecated? Will it break or be removed in the future, or am I OK to keep relying on it? The only alternative to TorchScript I see is to rewrite my code directly using the C++ API, but since I have users that rely on the Python code, that would mean duplicating the entire codebase.


AFAIK the new thing is AOTInductor:

I am not 100% clear on the usage of autograd here, but I would expect it to work in a similar fashion there.

There is AOT Autograd - How to use and optimize? — functorch nightly documentation

And as I understand it, that’s already being called by the export function so it might already work.

torch.export doesn’t seem to be able to export graphs with backward or torch.autograd.grad calls. A simple

import torch


class M(torch.nn.Module):
    def forward(self, x):
        x.sum().backward()
        return x.grad


torch.export.export(M(), (torch.tensor(1., requires_grad=True),))

fails with an unsupported-operation error, so I don’t think autograd is a current use case of AOTInductor. Compiled autograd seems to me to be exclusive to torch.compile.
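(To be clear, regular autograd does run through torch.compile-d functions as long as you stay in Python. A quick sketch, using the lightweight "eager" backend purely for illustration:)

```python
import torch


# backend="eager" skips code generation; it is just a convenient way to
# exercise the torch.compile machinery without a compiler toolchain installed
@torch.compile(backend="eager")
def f(x):
    return (x ** 2).sum()


x = torch.randn(3, requires_grad=True)
f(x).backward()  # backward through the compiled region populates x.grad
```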

I see what you mean, however: it seems like aot_function can be used to get an FX-traced graph of the backward call. If this is something that torch.export can use to generate an exported program, then maybe AOTI can compile it into a blob? That would be a step in the right direction. If this is the case, I haven’t figured out how to pull that off yet. If the capability is there, it would be handy if torch.export.export had an export_backward flag =)
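For instance, something along these lines surfaces the traced forward and backward graphs (a sketch based on the functorch docs linked above; the print_graph callback name is mine):

```python
import torch
from functorch.compile import aot_function


def print_graph(gm, sample_inputs):
    # Compiler callback: receives an FX-traced graph (forward or backward)
    print(gm.code)
    return gm  # returning the GraphModule unchanged keeps eager semantics


def fn(x):
    return x.sin().sum()


# fw_compiler/bw_compiler are invoked with the forward and backward graphs
aot_fn = aot_function(fn, fw_compiler=print_graph, bw_compiler=print_graph)

x = torch.randn(3, requires_grad=True)
aot_fn(x).backward()  # triggers tracing/compilation of both graphs
```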