How can I resize image tensors?

I have a batch of images as a tensor with shape [32, 3, 640, 640] and values in the range [0, 1] after dividing by 255. How can I resize that tensor to [32, 3, 576, 576]?

I see the option torch.nn.functional.interpolate, and its documentation says: "The input dimensions are interpreted in the form: mini-batch x channels x [optional depth] x [optional height] x width." I am not clear about the [optional depth] x [optional height] x width part, because in my case I have 576x576. Thanks.

This should work:

import torch
import torch.nn.functional as F

x = torch.randn(32, 3, 640, 640)
# size covers the spatial dims only (height, width); batch and
# channel dims are kept. Defaults to mode='nearest'.
y = F.interpolate(x, size=(576, 576))
print(y.shape)
# torch.Size([32, 3, 576, 576])
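If you want the result to line up more closely with OpenCV's resize, a minimal sketch using bilinear interpolation (the `antialias` flag assumes PyTorch 1.11 or newer):

```python
import torch
import torch.nn.functional as F

x = torch.rand(32, 3, 640, 640)  # image batch, values in [0, 1]

# mode='bilinear' with antialias=True tends to match cv2.resize
# for downscaling better than the default mode='nearest'.
y = F.interpolate(x, size=(576, 576), mode="bilinear",
                  align_corners=False, antialias=True)
print(y.shape)
# torch.Size([32, 3, 576, 576])
```

Values stay in [0, 1] since bilinear resampling only takes weighted averages of existing pixels.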

Thanks. It worked; I compared the output against OpenCV and the results match.