Hi, I find that after I use transforms.Resize() the value range of the resized image changes.
import torch
from torchvision import transforms

a = torch.randint(0, 255, (500, 500), dtype=torch.uint8)
a = torch.unsqueeze(a, dim=0)
compose = transforms.Compose([transforms.ToPILImage(), transforms.Resize((128, 128))])
a_trans = compose(a)
print(a.shape)  # torch.Size([1, 500, 500])
The original range is [0, 254] (torch.randint's upper bound is exclusive), but after transforms.Resize() the value range shrinks to [79, 179].
I want to resize without changing the value range; could someone help? Thank you.
The problem is solved: the default interpolation for torchvision.transforms.Resize() is BILINEAR, which averages neighboring pixels.
So just set interpolation=InterpolationMode.NEAREST.
Then the value range won't change!
Did you get this warning?
UserWarning: Argument interpolation should be of type InterpolationMode instead of int. Please, use InterpolationMode enum.
It still trains; I'm not too sure what this warning means.
This warning points to this PR, which seems to have introduced the InterpolationMode argument, since torchvision.transforms now support tensors as well as PIL.Images (at least the majority do, if I'm not mistaken).
CC @vfdev-5 to correct me.
Can you please see this: Regarding transforms.Resize and drastic changes in accuracy. I have the same question about which is the better or preferred way to resize an image. Thanks in advance, Sriram Na.