Grad_fn object is not callable with custom loss function

I had to create a custom loss function, implemented as a torch.autograd.Function wrapped in a torch.nn.Module.
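For context, my CustomLoss has roughly the structure below. The actual forward/backward math is different and not shown here; the L1-style bodies and the CustomLossModule name are just placeholders to illustrate the shape of the code.

import torch
import torch.nn as nn

class CustomLoss(torch.autograd.Function):
    # Placeholder bodies: an L1-style loss, only to illustrate the structure
    @staticmethod
    def forward(ctx, input, target):
        diff = input - target
        ctx.save_for_backward(diff)
        return diff.abs().mean()

    @staticmethod
    def backward(ctx, grad_output):
        diff, = ctx.saved_tensors
        # Gradient of mean(|input - target|) w.r.t. input; no gradient for target
        return grad_output * diff.sign() / diff.numel(), None

class CustomLossModule(nn.Module):
    # The Module simply delegates to the autograd Function
    def forward(self, input, target):
        return CustomLoss.apply(input, target)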

I wanted to verify that my code is programmatically correct, so I called grad_fn manually for testing purposes, as follows:
loss.grad_fn(torch.tensor(1).float())

When I run the above line of code on a loss computed with torch.nn.L1Loss, it correctly returns the gradients.
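For reference, the working built-in case looks roughly like this (x and y are just example tensors, not my real data):

import torch
import torch.nn as nn

x = torch.randn(5, requires_grad=True)
y = torch.randn(5)

loss = nn.L1Loss()(x, y)
print(loss.grad_fn)                           # <L1LossBackward object at 0x...>
grad = loss.grad_fn(torch.tensor(1).float())  # calling the backward node directly works here
print('grad\n', grad)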

However, when I execute the same line with my custom function, I get:

loss
 tensor(0.2361, grad_fn=<CustomLossBackward>)
<torch.autograd.function.CustomLossBackward object at 0x8211d9240>
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-511-a9c2beed2c03> in <module>
     32 print(loss.grad_fn)
     33 
---> 34 grad = loss.grad_fn(torch.tensor(1).float())
     35 print('grad\n', grad)

TypeError: 'CustomLossBackward' object is not callable

Does anyone know why this error is happening?
I see that the grad_fn of my custom loss has the type torch.autograd.function.CustomLossBackward
(<torch.autograd.function.CustomLossBackward object at 0x8211d9240>),

while the L1Loss one stands on its own as L1LossBackward
(<L1LossBackward object at 0x82112b470>).

I'm not sure whether this difference means anything.
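This is roughly how I compared the two grad_fn objects (reusing the placeholder CustomLossModule from the sketch above, again with example tensors):

import torch
import torch.nn as nn

x = torch.randn(5, requires_grad=True)
y = torch.randn(5)

builtin_loss = nn.L1Loss()(x, y)
custom_loss = CustomLossModule()(x, y)   # CustomLossModule from the sketch above

print(type(builtin_loss.grad_fn))   # L1LossBackward -- callable
print(type(custom_loss.grad_fn))    # torch.autograd.function.CustomLossBackward -- raises TypeError when called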