Where to begin developing a custom backend for torch.compiler?

I am looking for references to understand how Dynamo works and how exactly I can write a custom backend for torch.compiler. For example, invoking the following function:

torch.compiler.list_backends()

returns the following list:

['cudagraphs', 'inductor', 'onnxrt', 'openxla', 'openxla_eval', 'tvm']

How can I add a custom backend to this list? How to begin?

We are moving most of those backends out of core in early December; see this RFC for more details: [RFC]: Moving most torch.compile backends out of core by 12/1/23 · Issue #109687 · pytorch/pytorch · GitHub

To write a custom backend, follow the docs here: Custom Backends — PyTorch main documentation. Here’s a case study of Hidet doing this: Introducing Hidet: A Deep Learning Compiler for Efficient Model Serving | PyTorch
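
In case a concrete starting point helps, here is a minimal sketch along the lines of the Custom Backends docs: a backend is just a callable that receives the captured torch.fx.GraphModule plus example inputs and returns a callable. The registration step assumes torch._dynamo.register_backend, which is a private API and may change between releases; the names my_backend and f are only for illustration.

from typing import List

import torch
from torch._dynamo import register_backend  # private API; subject to change

@register_backend  # registers under the function's name so it shows up in list_backends()
def my_backend(gm: torch.fx.GraphModule, example_inputs: List[torch.Tensor]):
    # Inspect or transform the captured FX graph here
    gm.graph.print_tabular()
    # Return a callable that executes the graph; returning gm.forward
    # simply falls back to eager execution of the captured graph
    return gm.forward

@torch.compile(backend=my_backend)
def f(x):
    return torch.relu(x) + 1

f(torch.randn(4))
print(torch.compiler.list_backends())  # 'my_backend' should now appear

Note that you can also pass the callable directly, e.g. torch.compile(model, backend=my_backend), without registering it at all. The Custom Backends docs additionally describe registering an out-of-tree backend through a setuptools entry point (I believe under the torch_dynamo_backends group) so it is discoverable by name without importing your package explicitly.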
