The problem is solved: the default interpolation for `torchvision.transforms.Resize` is bilinear.
So just set `transforms.Resize((128, 128), interpolation=Image.NEAREST)` (or `InterpolationMode.NEAREST` on newer torchvision versions).
Then the value range won't change!
This warning points to this PR, which seems to have introduced the `InterpolationMode` argument, since torchvision.transforms now support tensors as well as PIL.Images (at least the majority of transforms do, if I'm not mistaken).