Row-wise autograd of a tensor using PyTorch autograd, without a for loop

Hi, I have the same question, but I only need the gradient of one layer, to save some time in my model. However, `jacrev` doesn't let me choose which layer's parameters to backpropagate through. My full problem is described here: Autograd on a specific layer's parameters.
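One way to approach this, sketched below under the assumption of a small hypothetical `Sequential` model: `torch.func.functional_call` lets you run the model with an explicit parameter dict, so you can pass only the target layer's parameters as the argument that `jacrev` differentiates, while merging the remaining parameters in as constants. `jacrev` then computes Jacobians only for that layer.

```python
import torch
from torch.func import functional_call, jacrev

# hypothetical two-layer model used only for illustration
model = torch.nn.Sequential(
    torch.nn.Linear(4, 8),
    torch.nn.ReLU(),
    torch.nn.Linear(8, 3),
)

x = torch.randn(5, 4)

# split the parameters: differentiate only w.r.t. the last layer ("2")
params = dict(model.named_parameters())
target = {k: v for k, v in params.items() if k.startswith("2.")}
frozen = {k: v.detach() for k, v in params.items() if not k.startswith("2.")}

def forward(target_params):
    # merge the differentiated and frozen params; jacrev only
    # differentiates w.r.t. target_params, the function's argument
    return functional_call(model, {**target_params, **frozen}, (x,))

# jac is a dict mapping "2.weight" / "2.bias" to Jacobians of shape
# (*output_shape, *param_shape)
jac = jacrev(forward)(target)
print(jac["2.weight"].shape)  # torch.Size([5, 3, 3, 8])
```

Since the other layers never appear in the differentiated argument, no Jacobian is computed for them, which is where the time saving comes from.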