Let’s say an index matrix or embedding-lookup matrix does not start from 0 and run to n:
a = torch.randint(10, 100, (10, 10))
old_index = torch.unique(a)
new_index = torch.arange(len(old_index))
Is there a way to map the new indices onto the matrix in PyTorch?
Something like this in pandas
a.map(dict(zip(old_index, new_index)))
or NumPy
import numpy as np
import torch

a = torch.randint(10, 100, (10, 10))
b = a.numpy()
old_index = np.unique(b)
new_index = np.arange(len(old_index))
index_map = dict(zip(old_index, new_index))
np.vectorize(index_map.get)(b)
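For reference, here is a minimal pure-PyTorch sketch of the remapping I have in mind; it relies on `torch.unique` with `return_inverse=True`, which (as I understand it) returns each element's position among the sorted unique values, i.e. exactly the dense 0..n-1 relabeling:

```python
import torch

a = torch.randint(10, 100, (10, 10))

# return_inverse=True gives, for every element of `a`, its index into
# the sorted unique values -- the dense remapping in one call.
old_index, remapped = torch.unique(a, return_inverse=True)

# `remapped` has the same shape as `a`, with values in [0, len(old_index)),
# and indexing back with old_index reconstructs the original matrix.
assert remapped.shape == a.shape
assert torch.equal(old_index[remapped], a)
```

(Since `old_index` is sorted, `torch.searchsorted(old_index, a)` should produce the same result.)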