# All-to-all element-wise L2 distance between vectors

Hi all,

I want to calculate the L2 distance between all pairs of elements of two vectors. The naive version of the code is:

```python
import torch

n = 4
d = 2
A = torch.randn(n, d)
B = torch.randn(n, d)
C = torch.zeros(n, n)

for i in range(n):
    for j in range(n):
        C[i][j] = torch.sum(torch.sqrt((A[i] - B[j]) * (A[i] - B[j])))
```

Is there an easier way to do this calculation in PyTorch?
Thanks!
Jun

Not answering your question, but shouldn't your `sum` be inside your `sqrt`?

As far as answering your question, isn't it just approximately the Gram matrix? So you can do something like:

``torch.sqrt(a @ b.transpose(0,1))``

Oh, yeah. It should be inside the `sqrt`.

Well, thanks a lot! Can the `@` operator be used on Variables, with autograd support?
I want to implement my custom loss function here.

Looks like `@` works ok on Variables:

```python
In : a = torch.autograd.Variable(torch.rand(3, 4), requires_grad=True)

In : b = torch.autograd.Variable(torch.rand(4, 2))

In : c = a @ b

In : c
Out:
Variable containing:
 0.9375  0.5729
 1.3275  0.4193
 0.2320  0.1663
[torch.FloatTensor of size 3x2]
```
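A minimal sketch of checking that gradients actually flow through `@` (assuming a recent PyTorch, where Variables have been merged into tensors and `requires_grad=True` is set directly):

```python
import torch

# Check that autograd flows through the @ (matrix multiply) operator.
a = torch.rand(3, 4, requires_grad=True)
b = torch.rand(4, 2)

c = a @ b           # shape (3, 2)
c.sum().backward()  # backprop a scalar through the product

# For loss = sum(a @ b), d(loss)/d(a[i, j]) = sum of row j of b,
# so every row of a.grad equals b.sum(dim=1).
print(a.grad)
```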

Cool, but that's only the Gram matrix, and I want to calculate the Euclidean distance matrix here (sorry, I did not explain it well before). Is there any better way?

How is this?

```python
# Uses ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, expanded to all pairs.
TwoAB = 2 * A @ B.transpose(0, 1)                         # (n, n)
sqA = torch.sum(A * A, 1, keepdim=True)                   # (n, 1)
sqB = torch.sum(B * B, 1, keepdim=True).transpose(0, 1)   # (1, n)
print(torch.sqrt(sqA.expand_as(TwoAB) + sqB.expand_as(TwoAB) - TwoAB))
```
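For reference, a minimal sketch assuming a recent PyTorch version: the built-in `torch.cdist` computes the same all-pairs Euclidean distance matrix directly, and a broadcasting one-liner gives an equivalent result without the expansion trick:

```python
import torch

n, d = 4, 2
A = torch.randn(n, d)
B = torch.randn(n, d)

# All-pairs L2 distances via broadcasting:
# (n, 1, d) - (1, n, d) -> (n, n, d), then reduce over the feature dim.
D_broadcast = torch.sqrt(((A.unsqueeze(1) - B.unsqueeze(0)) ** 2).sum(-1))

# Built-in pairwise distance; p=2 is the Euclidean (L2) case.
D_cdist = torch.cdist(A, B, p=2)

print(torch.allclose(D_broadcast, D_cdist, atol=1e-6))
```

The broadcasting version materializes an (n, n, d) intermediate, so for large n the expansion trick above or `torch.cdist` is more memory-friendly.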