I have a variable `a = Variable(torch.Tensor(5,5))`. Is there any way to calculate the determinant of that variable?

```
import torch
import numpy as np
from torch.autograd import Variable

a = Variable(torch.randn(5, 5))
# Compute the determinant on the underlying tensor via NumPy
# (this does not track gradients).
np.linalg.det(a.data.numpy())
```

In fact, I want to get the gradient of the det w.r.t. each element in the matrix.
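For reference, the gradient of the determinant has a closed form (Jacobi's formula): d det(A)/dA = det(A) · A⁻ᵀ, so it can be computed and checked even without autograd support. A minimal NumPy sketch verifying the formula against finite differences:

```python
import numpy as np

# Jacobi's formula: d det(A) / dA = det(A) * inv(A).T
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

analytic = np.linalg.det(A) * np.linalg.inv(A).T

# Central finite differences as an independent check.
eps = 1e-6
numeric = np.zeros_like(A)
for i in range(5):
    for j in range(5):
        E = np.zeros_like(A)
        E[i, j] = eps
        numeric[i, j] = (np.linalg.det(A + E) - np.linalg.det(A - E)) / (2 * eps)
```

The two gradients should agree to within finite-difference error.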

I believe the results would be numerically unstable for a large matrix.

There was some discussion on TensorFlow GitHub issues to that effect.

It's sad that PyTorch does not have a determinant function. This basically makes it impossible (ok, very hard) to implement Gaussian mixture density networks with a full covariance matrix.

Hi @dpernes,

the happy news is that between the last post and yours, we added documentation to make the Cholesky functions easier to find, so

```
torch.potrf(a).diag().prod()
```

gives you the square root of the determinant (square it to get the determinant itself). (The functions are exactly the same as in 0.1.12, too, but be sure to use the master docs.)
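To see why the diagonal product relates to the determinant: for a positive definite A = LLᵀ, det(A) = (∏ diag(L))². A NumPy check of that identity:

```python
import numpy as np

# For a symmetric positive definite A = L @ L.T,
# det(A) = prod(diag(L)) ** 2.
rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)    # make A positive definite

L = np.linalg.cholesky(A)      # lower-triangular factor
det_via_chol = np.prod(np.diag(L)) ** 2
```

`det_via_chol` should match `np.linalg.det(A)` up to floating-point error.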

If you want a differentiable version, you could make a `Cholesky` layer by combination with inverse (in lieu of having triangular solving in autograd) and then do `Cholesky.apply(v).diag().prod()`.

Although the notebook is not as finished as I would like, there is a Cholesky layer in my notebook that handles basic Gaussian Process regression.

Best regards

Thomas

When I use `torch.potrf(a).diag().prod()` I get a `TypeError: Type Variable doesn't implement stateless method potrf` error. I cannot use this function with the new version of PyTorch.

Either use the above on Tensors or `Cholesky.apply` with the linked autograd Function.

Best regards

Thomas

Sure, I can use `torch.potrf(a).diag().prod()` when `a` is a Tensor, but I need to do the operation with autograd when I call `backward()`. Would you please help me solve this problem?

Hi,

the `Cholesky` class from the notebook linked above is a (not terribly good, because it uses "inverse" instead of a triangular solver on a matrix we know to be triangular) `autograd.Function` that does the same as `potrf` does on Tensors. `diag` and `prod` should work on Variables.

Note that this only works for positive definite matrices (e.g. covariance matrices).
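The positive-definiteness requirement is easy to demonstrate: the factorization simply does not exist for an indefinite matrix, and (in NumPy, for illustration) `cholesky` raises `LinAlgError`. A small sketch:

```python
import numpy as np

# Cholesky factorization only exists for positive definite matrices.
A_pd = np.array([[2.0, 1.0], [1.0, 2.0]])     # eigenvalues 1 and 3
A_indef = np.array([[1.0, 2.0], [2.0, 1.0]])  # eigenvalues 3 and -1

L = np.linalg.cholesky(A_pd)  # succeeds

failed = False
try:
    np.linalg.cholesky(A_indef)
except np.linalg.LinAlgError:
    failed = True  # the indefinite matrix has no Cholesky factor
```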

Best regards

Thomas

Thanks for your help. I found that the code gives the square root of the determinant, so it needs to be squared.

If anyone is looking at this thread: Note that potrf has gained differentiability in master/0.3.

Best regards

Thomas

However, we still have to use `torch.inverse`.

There is `torch.potrs` if you have a system with the symmetric matrix, and the general solver `torch.gesv` if you want to solve something with the factor as matrix (a triangular solve would be nice, of course). In my candlegp Gaussian Process library I caught myself forgetting to specify `upper=False` to get the lower factor - that was a greater nuisance than the solver…
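Solving A x = b with a Cholesky factor amounts to two triangular solves, L y = b then Lᵀ x = y. A NumPy sketch of the idea (using the general solver in place of the dedicated triangular solve discussed above):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)   # symmetric positive definite
b = rng.standard_normal(4)

L = np.linalg.cholesky(A)     # lower factor (cf. upper=False in torch.potrf)
y = np.linalg.solve(L, b)     # forward substitution: L y = b
x = np.linalg.solve(L.T, y)   # back substitution:   L.T x = y
```

`x` then solves the original system `A x = b`.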

Best regards

Thomas

@tom I got many problems when using these LAPACK functions (for example). The Cholesky decomposition usually throws a `the leading minor of order ... is not positive definite` error, even with a high jitter level (1e-5, sometimes even 1e-4).
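A common workaround for this error is to retry the factorization with increasing diagonal jitter. A NumPy sketch (the helper name, jitter values, and retry schedule are illustrative, not from any library):

```python
import numpy as np

def chol_with_jitter(A, jitter=1e-6, max_tries=5):
    """Retry Cholesky, scaling the diagonal jitter up by 10x on each failure.

    Illustrative helper, not a library function.
    """
    n = A.shape[0]
    for i in range(max_tries):
        try:
            return np.linalg.cholesky(A + jitter * (10 ** i) * np.eye(n))
        except np.linalg.LinAlgError:
            continue
    raise np.linalg.LinAlgError("not positive definite even with jitter")

# An exactly singular (rank-1) covariance-like matrix: plain Cholesky fails,
# but a tiny jitter on the diagonal makes it factorizable.
A = np.array([[1.0, 1.0], [1.0, 1.0]])
L = chol_with_jitter(A)
```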

Your `candlegp` library is really nice! What do you think about using Pyro with it? I made a simple version here (https://github.com/fehiepsi/pytorch-notebooks/blob/master/executable/GaussianProcess.ipynb; sorry for not putting any comments in the code).

Just for anyone who is still looking for this: since version 0.4, we can compute the determinant of a matrix using `torch.det(A)`.
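Since `torch.det` is differentiable, the gradient question from the original post is handled by autograd directly. A small check (assuming PyTorch ≥ 0.4) that the gradient matches Jacobi's formula:

```python
import torch

A = torch.tensor([[2.0, 0.0], [0.0, 3.0]], requires_grad=True)
d = torch.det(A)   # det = 6
d.backward()

# Jacobi's formula: d det(A) / dA = det(A) * inv(A).T
expected = d.detach() * torch.inverse(A.detach()).t()
```

Here `A.grad` should equal `expected` (i.e. diag(3, 2) for this `A`).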