Interpreting backward gradients using register_backward_hook

Hi,
I have a model M. I pass an input batch of size B through it, and I have registered a backward hook (via register_backward_hook) on one of the model's submodules. Now I want the contribution of the first data point in the batch to the gradient at that submodule. How do I go about that? The gradient seen in the hook always has shape B x (the submodule's output shape). Is that contribution simply gradient[0, :, :, :]?
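
Roughly, the hook registration looks like this (conv1 is just a placeholder name for the component I actually hook, and the shape in the comment assumes it is a conv layer):

def hook(module, grad_input, grad_output):
    # grad_output is a tuple; grad_output[0] has shape B x C x H x W for a conv layer
    print(grad_output[0].shape)

M.conv1.register_backward_hook(hook)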

The backward pass code looks like

out = M(inp)
out[0, 0].backward(retain_graph=True)

Here out[0, 0] is the score of the first class for the first data point.
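
For concreteness, a self-contained toy version of the whole setup would be something like the following (the small Sequential model is just a stand-in for M, and the hooked conv layer stands in for the component I care about):

import torch
import torch.nn as nn

B = 4
M = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 32 * 32, 10),
)

def hook(module, grad_input, grad_output):
    g = grad_output[0]            # shape: B x 8 x 32 x 32
    print(g.shape)
    print(g[1:].abs().sum())      # how much gradient lands on samples other than the first?

M[0].register_backward_hook(hook)  # hook the conv component

inp = torch.randn(B, 3, 32, 32)
out = M(inp)                        # shape: B x 10
out[0, 0].backward(retain_graph=True)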

Any help is appreciated.