Does batched_nms in torch support backpropagation?

Hi, I have a question about backpropagation through batched_nms in PyTorch.

Some works use an NMS operation inside the model's forward pass, and I wonder whether the computation graph stays connected between the input and output of NMS.


Yes, this should be the case, and you can verify it by passing an input with requires_grad=True and checking the .grad_fn of the output, which should show a valid backward function. I'm not at my workstation right now, or I would quickly verify it.