Torch determinant function fails for batched inputs

Hi everyone,

I’ve been following the torch.det(x) example here, yet I get the following error and I can’t seem to find a solution online. My code is below:

>>> import torch
>>> A = torch.randn(3, 3)  # initialise a 3x3 tensor
>>> torch.det(A)  # compute the determinant
tensor(0.4273)
>>> 
>>> A = torch.randn(3, 2, 2)  # initialise a 3x2x2 tensor
>>> A
tensor([[[-0.6285, -0.2763],
         [-0.8588,  1.3266]],

        [[ 1.3004, -0.7479],
         [-1.1130, -0.2943]],

        [[-1.6380, -1.3417],
         [-0.0915,  0.8072]]])
>>> A.det()  # compute the determinant of each 2x2 slice A[i,:,:]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: det(torch.FloatTensor{[3, 2, 2]}): expected a 2D square tensor of floating types
>>> torch.det(A)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: det(torch.FloatTensor{[3, 2, 2]}): expected a 2D square tensor of floating types

This is the same as the example in the link above, yet I get an error, so I assume it might be a version issue. (I’ve copied the example from that link below for completeness.)

>>> A = torch.randn(3, 3)
>>> torch.det(A)
tensor(3.7641)

>>> A = torch.randn(3, 2, 2)
>>> A
tensor([[[ 0.9254, -0.6213],
         [-0.5787,  1.6843]],

        [[ 0.3242, -0.9665],
         [ 0.4539, -0.0887]],

        [[ 1.1336, -0.4025],
         [-0.7089,  0.9032]]])
>>> A.det()
tensor([1.1990, 0.4099, 0.7386])

My versions: PyTorch 1.1.0 (py3.7_cuda10.0.130_cudnn7.5.1_0).

Thank you in advance!

Looking at the documentation for torch.det in PyTorch 1.1.0, there is no batched example, so it’s most likely what you mentioned: the feature was made available in a later release. It seems batched determinants were introduced in 1.2.0 (that’s when the docs first include an example of this).
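In the meantime, a minimal workaround sketch for 1.1.0 (assuming you just need one determinant per matrix in the batch) is to call torch.det on each 2D slice yourself and stack the results:

>>> import torch
>>> A = torch.randn(3, 2, 2)
>>> # torch.det works on 2D square matrices in 1.1.0, so compute each
>>> # slice's determinant individually and stack the scalars into one tensor
>>> dets = torch.stack([torch.det(A[i]) for i in range(A.shape[0])])
>>> dets.shape
torch.Size([3])

This is slower than the batched implementation in 1.2.0+, but it gives the same shape of result without upgrading.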


Hi @AladdinPerzon, thanks for the response! It indeed seems like a version issue! Apologies for the silly question! Cheers!