torch.eq and torch.equal behaviors

MWE below. I have also printed out c and d, so I am pretty sure they contain the same elements. Am I misunderstanding something about how to compare two tensors for equality?

import torch
a = torch.rand(3,2,5)
b = torch.rand(3,5,4)
c = torch.stack([a[i] @ b[i] for i in range(3)])
d = torch.einsum('bij,bjk->bik', a, b)
print(torch.all(torch.eq(c,d))) #returns False
print(torch.equal(c,d)) #returns False

Hi Qiyao!

This is due to round-off error.

Your calculations for c and d are mathematically equivalent, but the
order of the floating-point operations differs, and that difference in
order introduces numerical round-off error.
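
Floating-point arithmetic is not associative, so performing the same
sums and products in a different order can change the last few bits of
the result. A quick illustration in plain Python (just to show the
effect, this is not part of your script):

# Regrouping the same three numbers changes the rounded result.
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))   # False
print((0.1 + 0.2) + 0.3, 0.1 + (0.2 + 0.3))     # 0.6000000000000001 0.6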

This can be seen by modifying your example to check not just for
exact equality between c and d, but for c and d being nearly equal
to one another by using torch.allclose().

Here is a modified version of your script:

import torch
print(torch.__version__)
_ = torch.manual_seed(2022)
a = torch.rand(3, 2, 5)
b = torch.rand(3, 5, 4)
c = torch.stack([a[i] @ b[i] for i in range(3)])
d = torch.einsum('bij,bjk->bik', a, b)
print(torch.all(torch.eq(c, d)))   # returns tensor(False)
print(torch.equal(c, d))           # returns False
print(torch.allclose(c, d))        # returns True

And here is its output:

1.10.2
tensor(False)
False
True
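
If you want to see how big the discrepancy actually is, you can look at
the maximum element-wise difference directly. Here is a short,
self-contained sketch (the extra print() line is mine, not part of the
original script):

import torch
_ = torch.manual_seed(2022)
a = torch.rand(3, 2, 5)
b = torch.rand(3, 5, 4)
c = torch.stack([a[i] @ b[i] for i in range(3)])
d = torch.einsum('bij,bjk->bik', a, b)
# The largest element-wise difference is tiny (roughly float32 machine
# epsilon, around 1e-7) and far smaller than the values themselves, so
# allclose(), whose default tolerances are rtol=1e-05 and atol=1e-08,
# still treats c and d as equal.
print((c - d).abs().max())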

Best.

K. Frank
