In the documentation, torch.topk takes four parameters: topk(self, k, dim=None, largest=True, sorted=True). Even with sorted=False, it returns the elements sorted in ascending order. This is quite confusing. If sorted=False, it should return the elements first come, first served; essentially the indices should be sorted.
I ran into the same issue! Hope someone knows how to fix it.
Setting sorted=True guarantees that the results will be sorted, while sorted=False can return any ordering (including a sorted one), depending on the device used.
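Here is a minimal sketch of the difference (the tensor and the printed values are just an illustrative example, and the sorted=False ordering is not guaranteed). It also shows one way to get the "first come, first served" ordering asked about above, by sorting the returned indices yourself:

```python
import torch

x = torch.tensor([3.0, 1.0, 4.0, 1.5, 9.0, 2.6])

# sorted=True: the top-k values come back in descending order
values, indices = torch.topk(x, k=3, sorted=True)
print(values)   # tensor([9., 4., 3.])
print(indices)  # tensor([4, 2, 0])

# sorted=False: the ordering of the top-k elements is unspecified;
# it may happen to be sorted on some devices, but don't rely on it.
values_u, indices_u = torch.topk(x, k=3, sorted=False)

# To get the top-k elements in their original (index) order,
# sort the returned indices and reorder both tensors accordingly.
order = indices_u.argsort()
print(values_u[order])   # tensor([3., 4., 9.])
print(indices_u[order])  # tensor([0, 2, 4])
```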
That helps a lot! Thank you very much.