What is another method for Jacobian calculation instead of using torch.autograd.functional.jacobian

Hi all, I am trying to calculate the Jacobian of a 1000×1000 matrix. When torch.autograd.functional.jacobian is used, CUDA raises an out-of-memory error, so I need some other method that avoids it.
For example, a NumPy-based calculation, or any other sample code to compute the Jacobian matrix.
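To make it concrete, what I have in mind is something along the lines of this NumPy forward-difference sketch, which only ever holds one perturbed evaluation in memory at a time (the function, step size, and test inputs here are just illustrative, not my actual model):

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Approximate the Jacobian of f at x with forward differences.

    Builds the Jacobian one column at a time, so peak memory is the
    output matrix itself plus a single function evaluation.
    """
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x), dtype=float).ravel()
    J = np.empty((f0.size, x.size))
    for i in range(x.size):
        x_pert = x.copy()
        x_pert[i] += eps
        # i-th column: directional difference along coordinate i
        J[:, i] = (np.asarray(f(x_pert), dtype=float).ravel() - f0) / eps
    return J

# toy check: f(x) = [x0**2, x0*x1] has Jacobian [[2*x0, 0], [x1, x0]]
J = jacobian_fd(lambda x: np.array([x[0]**2, x[0] * x[1]]),
                np.array([1.0, 2.0]))
```

Something like this would run on the CPU and sidestep the GPU entirely, but I am not sure it scales to my problem size, which is why I am asking for other approaches.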

Any help would be appreciated.