# Cdist does not return zero for distance between same vectors

Hi:

## Sample code

```python
import torch

a = torch.rand(64, 784)  # simulating a batch of flattened 28x28 images
print(a.shape, a)
dist = torch.cdist(a, a, p=2)
print('Diagonal distances')
for i in range(10):
    print(dist[i, i])
```

## Result

```
torch.Size([64, 784]) tensor([[0.7266, 0.4859, 0.2753,  ..., 0.2172, 0.7718, 0.1553],
        [0.3704, 0.5248, 0.3265,  ..., 0.5382, 0.1589, 0.8711],
        [0.4320, 0.1686, 0.0767,  ..., 0.0733, 0.2244, 0.4947],
        ...,
        [0.8390, 0.0061, 0.2814,  ..., 0.4127, 0.4423, 0.3151],
        [0.3753, 0.3822, 0.8913,  ..., 0.8308, 0.0026, 0.7139],
        [0.1975, 0.2592, 0.4194,  ..., 0.5257, 0.4047, 0.2934]])
Diagonal distances
tensor(0.)
tensor(0.0055)
tensor(0.0055)
tensor(0.0068)
tensor(0.)
tensor(0.0078)
tensor(0.)
tensor(0.)
tensor(0.0055)
tensor(0.)
```

I would expect the diagonal distances to be exactly 0.0, but I get values such as 0.0055, which are close to zero but not quite. Why is this the case?

Thanks!


You might be hitting a known precision issue: by default, `cdist()` computes Euclidean distances using a `matmul`-based formula, which can suffer from floating-point cancellation and leave small nonzero residues even between identical vectors.
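To illustrate the cancellation, here is a minimal sketch of the matmul-style squared-distance formula ‖x − y‖² = ‖x‖² + ‖y‖² − 2x·y (my own reconstruction for demonstration, not the exact internal kernel `cdist()` uses):

```python
import torch

a = torch.rand(64, 784)

# matmul-style squared distances: ||x||^2 + ||y||^2 - 2 x.y
sq_norms = (a * a).sum(dim=1)          # shape (64,)
gram = a @ a.t()                       # pairwise dot products, shape (64, 64)
sq_dist = sq_norms[:, None] + sq_norms[None, :] - 2 * gram

# Mathematically the diagonal is 0, but subtracting two nearly equal
# large numbers in float32 leaves small positive/negative residues.
print(sq_dist.diag())
```

Taking `sqrt` of those small residues (after clamping negatives to zero) produces exactly the kind of ~0.005 "distances" seen above.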

UPDATE: there is an option `compute_mode='donot_use_mm_for_euclid_dist'` in `cdist()` that avoids using `matmul` when computing the distances. With it, the diagonal distances come out as exactly `0`.
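A minimal sketch of the fix, reusing the same random-tensor setup as the original snippet:

```python
import torch

a = torch.rand(64, 784)

# Compute distances elementwise instead of via the matmul identity;
# this is slower but numerically exact for identical rows.
dist = torch.cdist(a, a, p=2, compute_mode='donot_use_mm_for_euclid_dist')

print(dist.diag())  # every entry is exactly 0
```

Note the trade-off: the default `'use_mm_for_euclid_dist_if_necessary'` mode is faster for large batches, so the exact mode is mainly worth it when precision near zero matters (e.g. when ranking nearest neighbors or taking square roots of tiny distances).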