Does the JIT plan to support differentiation of most operators? And when?

I saw that the JIT supports differentiation for some operators (about 80), but the number of ATen operators (about 700) is much larger than what is currently supported.
When do you plan to support differentiation for all operators?

We plan to increase the number of symbolic derivatives that we support, but we don't have a specific timeline for when we'll reach full coverage. In the meantime, autograd still works for JIT modules, so as long as you set requires_grad on your input tensors you will be able to call backward() just like in Python.
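To illustrate the point above, here is a minimal sketch (the module and tensor values are made up for the example) showing that calling backward() through a TorchScript module works as long as the input has requires_grad set:

```python
import torch

class Square(torch.nn.Module):
    def forward(self, x):
        # Simple differentiable computation: sum of squares.
        return (x * x).sum()

# Compile the module with the JIT.
scripted = torch.jit.script(Square())

# requires_grad=True lets autograd track the input.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = scripted(x)
y.backward()  # autograd works through the scripted module

print(x.grad)  # gradient of sum(x^2) is 2*x
```

Even when a particular operator has no symbolic derivative in the JIT, the fallback to eager-mode autograd means gradients are still computed correctly.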