Hash dataset (MNIST)

I have an array of 60,000 MNIST images. I am planning to hash the dataset (probably using the string representation of each tensor image) into a dictionary where key = hash and value = index in the dataset. Is there a way to do this without looping over the entire dataset? I ask because this seems to lend itself to parallelization.
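
For reference, here is a minimal sketch of the approach I have in mind, assuming the images come from torchvision's MNIST as tensors (torchvision and the dataset layout are just illustrative assumptions):

```python
import hashlib

from torchvision import datasets, transforms

# Load MNIST as tensors (assumption: torchvision is available).
mnist = datasets.MNIST(root="./data", train=True, download=True,
                       transform=transforms.ToTensor())

# key = hash of the image, value = index in the dataset.
hash_to_index = {}
for idx in range(len(mnist)):
    img, _ = mnist[idx]
    # Hash the string representation of the tensor, as described above.
    # (Hashing img.numpy().tobytes() would be an alternative key.)
    key = hashlib.sha1(str(img).encode("utf-8")).hexdigest()
    hash_to_index[key] = idx
```

This is the plain sequential loop; my question is whether building `hash_to_index` can be done without this explicit loop, e.g. in parallel.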