Strange behaviour with torch.clamp()

This behavior seems strange:

>>> import torch
>>> x = torch.tensor([1,2,3,4,5])
>>> torch.clamp(x, 2, 1)
tensor([2, 1, 1, 1, 1])

Based on the note in the docs, I would have expected all values to be set to the max of 1: torch.clamp — PyTorch 1.10 documentation

Python 3.7.12
Torch 1.5.0

Any thoughts?

So, I’ve just run this and got the expected behaviour.

>>> import torch
>>> x = torch.tensor([1,2,3,4,5])
>>> torch.clamp(x, 2, 1)
tensor([1, 1, 1, 1, 1])

My torch version is 1.11.0.dev20220201+cu111, so this bug has been patched, though I’m not sure which release resolved it.

Also, do you think there could be an issue because the min is larger than the max? Perhaps repeat it with a well-ordered range like torch.clamp(x, 0, 1) and see if that works?
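For what it’s worth, the docs note that when min > max, every element is set to max, i.e. the operation behaves like min(max(x, min_val), max_val) applied elementwise. A minimal sketch of that semantics in plain Python (clamp_scalar is a hypothetical helper name, not a torch API), which matches the all-ones output from newer torch versions:

```python
def clamp_scalar(x, min_val, max_val):
    # Documented clamp semantics: apply the lower bound first, then the upper.
    # When min_val > max_val, the upper bound wins, so every element
    # ends up at max_val.
    return min(max(x, min_val), max_val)

values = [1, 2, 3, 4, 5]
print([clamp_scalar(v, 2, 1) for v in values])  # min > max: all elements become 1
print([clamp_scalar(v, 2, 4) for v in values])  # ordinary range: clamped to [2, 4]
```

With min=2 and max=1 this gives [1, 1, 1, 1, 1], which is what torch 1.11 returns; the [2, 1, 1, 1, 1] from torch 1.5 looks like the old buggy behaviour.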