Here is a very simple example to understand my question:
```python
import torch

a = torch.arange(5).cuda()
b = torch.arange(5).cuda()
print(a.__hash__())
print(b.__hash__())
```
If you run this, you will see that the hash value of `a` differs from that of `b`, which is surprising to me. Is there an explanation for this? And is there a way to avoid this behavior? (I would like to use tensors as keys in a dictionary, but because of this, keys with equal values appear multiple times.)
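For reference, here is a minimal CPU-only sketch of what I am seeing, along with one workaround I am considering (converting each tensor to a tuple so the key hashes by content; using `tuple(t.tolist())` is just my own idea, not something I found in the docs):

```python
import torch

# Tensors appear to hash by object identity, not by content,
# so two tensors with the same values hash differently.
a = torch.arange(5)
b = torch.arange(5)
print(hash(a) == hash(b))  # False: distinct objects

# Possible workaround: derive a hashable, content-based key.
d = {}
d[tuple(a.tolist())] = "first"
d[tuple(b.tolist())] = "second"  # overwrites: same contents give the same key
print(len(d))  # 1
```

With the tuple keys, the second assignment overwrites the first, which is the behavior I actually want.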
Thanks a lot in advance for your help!