I want to wrap an existing torch.autograd.Function as a custom operator without modifying its internals, but the torch.library.register_autograd
API requires separate backward
and setup_context
functions. What is the proper way to turn a torch.autograd.Function
into a custom op?