Sub-Hessian vector product (how to stop gradient for only some entries of a tensor)

Hi,

I’m trying to compute the sub-Hessian vector product of a function efficiently. By sub-Hessian, I mean that I’d like to extract a submatrix from the full Hessian and compute a vector product with it.

I am aware of the standard trick for computing Hessian-vector products efficiently (differentiating the dot product of the gradient with the vector, so the full Hessian is never built), and I’d like to do something similar here.
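
For reference, here is roughly how I compute the full HVP today (a minimal sketch; `f`, `x`, and `v` are placeholders for my actual function and tensors):

```python
import torch

def hvp(f, x, v):
    # Standard trick: the HVP is the gradient of <grad f(x), v> w.r.t. x,
    # so the full Hessian is never materialized.
    x = x.detach().requires_grad_(True)
    grad = torch.autograd.grad(f(x), x, create_graph=True)[0]
    return torch.autograd.grad(grad @ v, x)[0]

# Sanity check: f(x) = sum(x**3) has Hessian diag(6 * x).
x = torch.tensor([1.0, 2.0, 3.0])
v = torch.ones(3)
print(hvp(lambda t: (t ** 3).sum(), x, v))  # tensor([ 6., 12., 18.])
```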

I think one solution could be to stop the gradient for some entries of the tensor X. That way I could get the sub-Hessian vector product by double differentiation, since the detached entries would drop out of the second derivative.
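
Concretely, something like the sketch below is what I have in mind (`mask_grad` is a hypothetical helper I made up; `mask` would be a boolean tensor selecting the entries that should stay differentiable):

```python
import torch

def mask_grad(x, mask):
    # Keep the values of all entries, but let gradients flow only through
    # the entries where mask is True; the rest are treated as constants.
    return torch.where(mask, x, x.detach())

# With y = mask_grad(x, mask), double-differentiating f(y) should give the
# masked Hessian applied to v: the detached entries contribute zero
# rows/columns, which is exactly the sub-Hessian vector product I want.
```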

Is there a way in PyTorch to stop the gradient for some entries of a tensor?

Thanks for your help