My torch version is 1.11.0.dev20220201+cu111 and I don't see the bug there, so it has been patched at some point, though I'm not sure which release fixed it.
Also, could the issue be that your min value is larger than your max? Perhaps try it again with torch.clamp(x, 0, 1) and see if that works?
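For reference, here's a minimal sketch of what I mean (the tensor values are just made up for illustration):

```python
import torch

x = torch.tensor([-0.5, 0.3, 1.7])

# Clamp every element into [0, 1]: values below 0 become 0,
# values above 1 become 1, everything else is unchanged.
y = torch.clamp(x, min=0.0, max=1.0)
print(y)  # tensor([0.0000, 0.3000, 1.0000])
```

With min=0 and max=1 in the right order, the result is guaranteed to lie in [0, 1], which should sidestep any weirdness from a min that exceeds the max.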