Hi all,
I have a network "net" that produces an image as output:

b = net(a)

The type of "b" is torch.cuda.FloatTensor of size 1x3x256x256. Now I want to resize "b" to 1x3x224x224.
To do that I am defining my transforms as follows:
transform_list_classifier = [
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
]
transform_classifier = transforms.Compose(transform_list_classifier)
and then I apply the transform like this:

input_classifier = transform_classifier(torch.Tensor(b.data.cpu()))
This input_classifier is supposed to be a torch.cuda.FloatTensor of size 1x3x224x224, but instead I get the error:

pic should be Tensor or ndarray. Got <class 'torch.FloatTensor'>.
Is there any way to directly resize a torch.cuda.FloatTensor of size 1x3x256x256 into a torch.cuda.FloatTensor of size 1x3x224x224?
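The closest thing I have found is torch.nn.functional.interpolate, which resizes a 4D batch directly on whatever device the tensor lives on, with no CPU/PIL round trip — though I am not sure its bilinear mode is exactly equivalent to transforms.Resize. A sketch (again, the random tensor stands in for net(a)):

```python
import torch
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"
b = torch.rand(1, 3, 256, 256, device=device)  # stand-in for b = net(a)

# Bilinear resize of the whole 1x3x256x256 batch on the current device.
resized = F.interpolate(b, size=(224, 224), mode="bilinear", align_corners=False)
print(resized.shape)  # torch.Size([1, 3, 224, 224])
```

Note that this only does the resize; the normalization step would still have to be applied separately, e.g. (resized - 0.5) / 0.5.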
Thanks in advance!