I want to calculate the L2 distance between every pair of rows of two matrices. The naive version of the code is:
import torch

n = 4
d = 2
A = torch.randn(n, d)
B = torch.randn(n, d)
C = torch.zeros(n, n)
for i in range(n):
    for j in range(n):
        C[i][j] = torch.sum(torch.sqrt((A[i] - B[j]) * (A[i] - B[j])))
Is there an easier way to do this calculation in PyTorch?
Not answering your question, but shouldn't your sum be inside your sqrt?
As far as answering your question, isn't it just approximately the Gram matrix? So you can do something like:
torch.sqrt(a @ b.transpose(0,1))
Oh, yeah. It should be inside the sqrt
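For reference, moving the sum inside the sqrt turns the loop into a true per-pair L2 distance. A sketch of the corrected naive version (same shapes as in the original post):

```python
import torch

n, d = 4, 2
A = torch.randn(n, d)
B = torch.randn(n, d)
C = torch.zeros(n, n)
for i in range(n):
    for j in range(n):
        # L2 distance: sqrt of the sum of squared differences
        C[i][j] = torch.sqrt(torch.sum((A[i] - B[j]) ** 2))
```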
Well, thanks a lot! Can the @ operator be used on Variables, and does it support autograd?
I want to implement my custom loss function here.
@ works ok on Variables:
In : a = torch.autograd.Variable(torch.rand(3,4), requires_grad=True)
In : b = torch.autograd.Variable(torch.rand(4,2), requires_grad=True)
In : c = a @ b
In : c
[torch.FloatTensor of size 3x2]
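(In current PyTorch the Variable wrapper is merged into plain tensors, so the same check can be sketched with requires_grad=True; the backward call here is just to confirm gradients flow through @.)

```python
import torch

# Modern equivalent of the Variable example above:
a = torch.rand(3, 4, requires_grad=True)
b = torch.rand(4, 2, requires_grad=True)
c = a @ b           # matrix multiply, tracked by autograd
c.sum().backward()  # gradients flow back to both operands
```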
Cool, but that's only the Gram matrix, and I want to calculate the Euclidean distance matrix here (sorry, I did not explain it well before). Is there a better way?
How is this?
TwoAB = 2 * A @ B.transpose(0, 1)
C = torch.sqrt(
    torch.sum(A * A, 1).expand_as(TwoAB)
    + torch.sum(B * B, 1).transpose(0, 1).expand_as(TwoAB)
    - TwoAB
)
print(C)
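This uses the identity ||a_i - b_j||^2 = ||a_i||^2 + ||b_j||^2 - 2 a_i·b_j. On current PyTorch the expand_as calls can be replaced by broadcasting; a sketch (the clamp is a guard I've added against tiny negative values from floating-point cancellation, not part of the original suggestion):

```python
import torch

n, d = 4, 2
A = torch.randn(n, d)
B = torch.randn(n, d)

# ||a_i - b_j||^2 = ||a_i||^2 + ||b_j||^2 - 2 a_i . b_j
# (n,1) + (n,) broadcasts to (n,n)
sq = (A * A).sum(1, keepdim=True) + (B * B).sum(1) - 2 * A @ B.t()
D = torch.sqrt(sq.clamp(min=0.0))  # clamp avoids sqrt of small negatives
```

For completeness, torch.cdist(A, B) computes the same matrix directly in recent PyTorch versions.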