Torch scatter non-deterministic behavior with non-unique indices

The `scatter` documentation warns: “When indices are not unique, the behavior is non-deterministic (one of the values from src will be picked arbitrarily) and the gradient will be incorrect (it will be propagated to all locations in the source that correspond to the same index)!”
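A minimal sketch of the problem (the tensor values here are made up for illustration): when two entries of `index` point at the same output position, `scatter_` picks one of the source values arbitrarily.

```python
import torch

src = torch.tensor([10.0, 20.0])
index = torch.tensor([0, 0])  # duplicate index: both entries target out[0]

out = torch.zeros(2).scatter_(0, index, src)
# out[0] ends up as either 10.0 or 20.0 -- which one "wins" is unspecified
```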

Given a src tensor and an index tensor with non-unique indices, is there a way to make `scatter` deterministic? I don’t care about gradients.
I tried to make the values at duplicated indices identical beforehand (I need the max), but after a few hours I still couldn’t get it to work…

Do you know any solution? I know how to do it with for loops only.
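For reference, this is the kind of for-loop version I mean (a small made-up example; the tensors and output size are hypothetical), computing the per-index maximum:

```python
import torch

src = torch.tensor([1.0, 4.0, 3.0, 2.0])
index = torch.tensor([0, 2, 2, 1])  # index 2 is duplicated

# Start from -inf so any real source value replaces the initial fill
out = torch.full((3,), float("-inf"))
for i in range(src.numel()):
    j = index[i].item()
    # Keep the maximum among all src values mapped to position j
    out[j] = max(out[j].item(), src[i].item())
# out -> [1., 2., 4.]: deterministic, but slow for large tensors
```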