I was transposing a rank-3 tensor. For rank-2 tensors, transposition follows the simple matrix rule Aij = Aji, but when I transposed a rank-3 tensor I ended up with the unexpected output given below.
Can someone explain to me how it is working under the hood?
import torch

a = torch.tensor(
    [[[1., 1., 1.],
      [1., 1., 1.],
      [1., 1., 1.]],
     [[0., 0., 0.],
      [0., 0., 0.],
      [0., 0., 0.]],
     [[1., 1., 1.],
      [1., 1., 1.],
      [1., 1., 1.]]])
Printing the tensor and the shape gives the following:
print(a)
print(a.shape)
tensor([[[1., 1., 1.],
         [1., 1., 1.],
         [1., 1., 1.]],

        [[0., 0., 0.],
         [0., 0., 0.],
         [0., 0., 0.]],

        [[1., 1., 1.],
         [1., 1., 1.],
         [1., 1., 1.]]])
torch.Size([3, 3, 3])
Transposing with a.T, we get:
tensor([[[1., 0., 1.],
         [1., 0., 1.],
         [1., 0., 1.]],

        [[1., 0., 1.],
         [1., 0., 1.],
         [1., 0., 1.]],

        [[1., 0., 1.],
         [1., 0., 1.],
         [1., 0., 1.]]])
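For what it's worth, my current understanding (which may be what needs correcting) is that .T on an n-dimensional tensor reverses the order of all dimensions rather than swapping just two, so for rank 3 it should act like permute(2, 1, 0), i.e. a.T[i, j, k] == a[k, j, i]. A minimal check of that assumption:

```python
import torch

# Assumption: a.T reverses all dimensions, i.e. equals a.permute(2, 1, 0)
# for a rank-3 tensor, so element [i, j, k] of a.T is element [k, j, i] of a.
a = torch.arange(27).reshape(3, 3, 3)

assert torch.equal(a.T, a.permute(2, 1, 0))

# Spot-check a single element: a.T[i, j, k] picks out a[k, j, i].
assert a.T[0, 1, 2] == a[2, 1, 0]
```

If that is the rule, it would explain the output above: every 3x3 slice of a.T has rows [1., 0., 1.] because the column index of the result selects the slab (ones, zeros, ones) of the original tensor.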