Torch transformation using numpy

I want to replicate a torchvision image transform in NumPy, but I am not sure whether I am doing it correctly. Here is the PyTorch code:

transform = torchvision.transforms.Compose([
    torchvision.transforms.Resize((224, 224)),
    torchvision.transforms.ToTensor(),
    torchvision.transforms.Normalize([0.5820, 0.4512, 0.4023], [0.2217, 0.1858, 0.1705])
])
img = Image.open("skin_cancer.jpg").convert('RGB')
img = transform(img)
print(torch.max(img[0,:,:]), torch.min(img[0,:,:]), torch.mean(img[0,:,:]))

I want to reproduce it in NumPy:

img = Image.open("skin_cancer.jpg").convert('RGB')
img = img.resize((224, 224))
img = np.array(img) / 255
img = np.moveaxis(img, 2, 0)
img[0,:,:] = (img[0,:,:] * 0.5820) / 255 * 0.2217
img[1,:,:] = (img[1,:,:] * 0.4512) / 255 * 0.1858
img[2,:,:] = (img[2,:,:] * 0.4023) / 255 * 0.1705

print(np.max(img[0,:,:]), np.min(img[0,:,:]), np.mean(img[0,:,:]))

I have two queries: is moveaxis the correct approach to change the image dimensions from (224, 224, 3) to (3, 224, 224), and is my normalization the correct way to replicate what PyTorch does? Also, is there a more elegant way to do this in NumPy? I am using ONNX and don't want to install torch.

Output produced in NumPy:
2.563400461673167 1.245668497439571 2.0800457636539056
Output produced in PyTorch:
tensor(1.6909) tensor(-0.3080) tensor(0.9488)

Yes, moveaxis is the right approach; it gives the same result as tensor.permute(2, 0, 1) here:

import numpy as np
import torch

arr = np.random.randn(224, 224, 3)
x = torch.from_numpy(arr).permute(2, 0, 1)
arr_new = np.moveaxis(arr, 2, 0)

print((x.numpy() == arr_new).all())
> True
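
As an aside, since you asked for something more elegant: np.transpose with an explicit axis order does the same permutation here and reads as the usual HWC-to-CHW idiom:

arr_t = np.transpose(arr, (2, 0, 1))  # same permutation as np.moveaxis(arr, 2, 0)
print((arr_t == arr_new).all())
> True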

Your normalization is wrong, though: Normalize subtracts the per-channel mean and then divides by the per-channel stddev, i.e. x = (x - mean) / std. Your code instead multiplies by the mean, and it also divides by 255 a second time even though the array was already scaled to [0, 1].
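
For reference, here is a minimal sketch of the whole pipeline in NumPy. It reuses your file name and stats; the one extra assumption is passing Image.BILINEAR explicitly, since torchvision's Resize defaults to bilinear interpolation while PIL's resize defaults to bicubic:

import numpy as np
from PIL import Image

mean = np.array([0.5820, 0.4512, 0.4023], dtype=np.float32)
std = np.array([0.2217, 0.1858, 0.1705], dtype=np.float32)

img = Image.open("skin_cancer.jpg").convert('RGB')
img = img.resize((224, 224), Image.BILINEAR)      # match torchvision's default interpolation
img = np.asarray(img, dtype=np.float32) / 255.0   # ToTensor: uint8 [0, 255] -> float [0, 1]
img = (img - mean) / std                          # Normalize: (x - mean) / std per channel
img = np.transpose(img, (2, 0, 1))                # HWC -> CHW

print(img[0].max(), img[0].min(), img[0].mean())

Normalizing before the transpose lets the shape-(3,) mean and std broadcast against the trailing channel axis without any reshaping. For ONNX Runtime you will usually also want a batch dimension, e.g. img[None].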