Issue comparing two tensors with isclose

I am working on a classification network that should determine whether two graphs are equal. To do so, I forward pairs of graphs that are either exactly equal (label 1) or different (label 0) and generate a vector embedding (of length 64) for each input. The two vectors are then concatenated and passed through a classification sub-network that makes the final prediction. Note that the embedding generation is carried out in the same way for both inputs.
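For context, here is a minimal sketch of the setup described above. The class name, layer sizes, and input dimension are my own placeholders, not the actual model; the point is only the structure: one shared encoder applied to both inputs, then a classifier on the concatenated embeddings.

```python
import torch
import torch.nn as nn

class SiamesePairClassifier(nn.Module):
    """Hypothetical sketch: shared encoder + classifier on concatenated embeddings."""

    def __init__(self, in_dim=32, embed_dim=64):
        super().__init__()
        # Shared encoder: both graphs go through the exact same layers.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )
        # Classification head operating on the concatenated pair of embeddings.
        self.classifier = nn.Sequential(
            nn.Linear(2 * embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, a, b):
        emb_a = self.encoder(a)  # embedding of the first graph
        emb_b = self.encoder(b)  # embedding of the second graph
        return self.classifier(torch.cat([emb_a, emb_b], dim=-1))
```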

My issue is that I am getting false negatives (pairs of equal input graphs classified as 0) when assessing the accuracy of the model. After some examination, I noticed that when two equal graphs are forwarded through the network, torch.isclose sometimes finds differences between the two vector embeddings at a few positions, even though they should be identical: the vector generation process is exactly the same for both inputs (they go through the same layers declared in the network). When I print the embeddings I cannot see any difference between them, and I made sure to call detach beforehand to get rid of the computational graphs. Does anyone know what may be going on?
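The diagnostic I am running looks roughly like the following sketch (the two embeddings here are stand-ins, with a tiny artificial discrepancy injected at one position to reproduce the symptom):

```python
import torch

# Stand-ins for the two 64-dim embeddings produced by the network.
emb_a = torch.randn(64)
emb_b = emb_a.clone()
emb_a[15] = -0.0237
emb_b[15] = emb_a[15] + 5e-7  # simulate a tiny numerical discrepancy

# Detach (as described above) and compare elementwise.
mask = torch.isclose(emb_a.detach(), emb_b.detach())
print(mask.all())                 # False: one position fails the default tolerances
print((~mask).nonzero())          # index/indices where the embeddings disagree
print(torch.dist(emb_a, emb_b))   # yet the L2 distance is tiny
```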

To illustrate: in the output of torch.isclose, two positions are marked False, yet the printed values are identical (-0.0237 at position 15 and 0.0252 at position 27). Consequently, torch.allclose returns False, even though the Euclidean distance between the two vectors is only 1.7302e-06. Despite the L2 distance being low enough to be considered 0, the network classifies the sample as 0 (0.49 probability) when it should be 1.
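One thing I noticed while investigating: PyTorch prints tensors with 4 decimal places by default, so two values can print identically while still differing in lower-order digits, and a difference of a few 1e-7 around a value like -0.0237 is enough to exceed the default isclose tolerances (rtol=1e-5, atol=1e-8). A small demonstration, with the second value shifted by a hypothetical 5e-7:

```python
import torch

a = torch.tensor([-0.0237])
b = a + 5e-7  # differs only in the 7th decimal place

# With the default print precision (4 decimals) both look identical:
print(a)  # tensor([-0.0237])
print(b)  # tensor([-0.0237])

# But torch.isclose (rtol=1e-5, atol=1e-8 by default) still flags it,
# because 5e-7 > 1e-8 + 1e-5 * 0.0237:
print(torch.isclose(a, b))  # tensor([False])

# Raising the print precision reveals the hidden discrepancy:
torch.set_printoptions(precision=10)
print(a, b)
```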