Backward function for arbitrary chain of operations

I’ve seen that one can implement custom gradients for a given function by inheriting from torch.autograd.Function and implementing the forward and backward static methods.
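
For reference, here is a minimal sketch of the pattern I mean (MySquare is just an illustrative name):

```python
import torch

class MySquare(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Save the input for use in the backward pass.
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        # d(x^2)/dx = 2x, chained with the incoming gradient.
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x

x = torch.tensor([3.0], requires_grad=True)
y = MySquare.apply(x)
y.backward()
print(x.grad)  # tensor([6.])
```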

I would like to know how to access the equivalent of the def backward(ctx, grad_output) method for an arbitrary function implemented in PyTorch.
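
To make the question concrete, this is the input/output contract I am after for an arbitrary chain of built-in operations (the sketch below uses torch.autograd.grad only to illustrate the contract, with grad_outputs playing the role of grad_output; f is a made-up example function):

```python
import torch

def f(x):
    # An arbitrary chain of built-in operations.
    return torch.exp(torch.sin(x))

x = torch.tensor([0.5, 1.0], requires_grad=True)
y = f(x)

# Desired: something equivalent to backward(ctx, grad_output) for the
# whole chain, i.e. a mapping from an upstream grad_output to the
# gradient with respect to x.
grad_output = torch.ones_like(y)
(grad_x,) = torch.autograd.grad(y, x, grad_outputs=grad_output)
print(grad_x)  # cos(x) * exp(sin(x))
```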

According to the PyTorch documentation:

Every operation performed on Tensors creates a new function object, that performs the computation, and records that it happened.

So I believe it should be possible to compute this automatically.
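
Inspecting grad_fn seems to confirm that these function objects are recorded and reachable from the output (the printed class names may vary by PyTorch version):

```python
import torch

x = torch.tensor([1.0], requires_grad=True)
y = torch.sin(x) * 2

# grad_fn points at the backward function of the last operation,
# and next_functions links back through the recorded chain.
print(y.grad_fn)                 # e.g. <MulBackward0 ...>
print(y.grad_fn.next_functions)  # e.g. ((<SinBackward0 ...>, 0), ...)
```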