How to convert an image with dimensionality HxWxC to CxHxW

Hi, I’m stuck on how to convert an image with dimensionality HxWxC to CxHxW, where C is the number of channels, H is the height, and W is the width.
Many thanks for your help.

Well, I got the answer by reading the documentation. Thanks for your attention.

For posterity:
tensor.permute(2, 0, 1) will do the job (permute is a Tensor method, called on the tensor itself)
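A minimal sketch of that call (the tensor name and sizes here are illustrative):

```python
import torch

# An image in HxWxC layout, e.g. H=224, W=224, C=3
img = torch.rand(224, 224, 3)

# permute takes the desired order of the existing dims:
# dim 2 (C) first, then dim 0 (H), then dim 1 (W)
chw = img.permute(2, 0, 1)
print(chw.shape)  # torch.Size([3, 224, 224])
```

Note that permute returns a view with rearranged strides; call .contiguous() afterwards if a downstream op needs contiguous memory.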


@richard Thanks for your excellent suggestion. I converted the tensor to numpy and then used numpy.transpose, but torch.permute does the job more cleanly. :grinning:
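For comparison, the numpy detour versus the pure-torch call (a sketch; the names are illustrative):

```python
import numpy as np
import torch

img = torch.rand(224, 224, 3)  # HxWxC

# numpy detour: convert, transpose, convert back
chw_np = torch.from_numpy(
    np.ascontiguousarray(np.transpose(img.numpy(), (2, 0, 1)))
)

# pure-torch equivalent: just a view, no framework round-trip
chw = img.permute(2, 0, 1)

assert torch.equal(chw_np, chw)
```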


@richard Hi, richard. Is there an operation in torch that can rotate tensors, just like rot90 in numpy? Converting between tensors (or Variables) and numpy arrays can be time-consuming and cumbersome, so I would rather not use numpy. :thinking:

If you are working on data augmentation, I recommend trying out torchvision.transforms.RandomRotation. Otherwise, it’s easier to do the rotation in Pillow.
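A usage sketch for that transform (the image path is hypothetical):

```python
from PIL import Image
import torchvision.transforms as T

img = Image.open("example.jpg")  # hypothetical input image

# Samples a fresh angle uniformly from [-30, 30] degrees on every call
rotate = T.RandomRotation(degrees=30)
rotated = rotate(img)
```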

@Fnjn Umm, I am not working on data augmentation. It’s for a custom function.

EDIT: But I think I could use the source code of torchvision.transforms.RandomRotation for reference. Thanks a lot.

@AlbertZhang not that I know of, but it could be something interesting to add to the API. Were you thinking of arbitrary rotations (I’m not sure how that would make sense…), or just rot90/270/180?

@richard just rot90/180/270 (a pure-torch sketch follows below). I don’t think arbitrary rotation would make sense for matrices.

EDIT: BTW, what would the derivative of such a rotation operation with respect to the matrix it operates on look like? A corresponding rotation operation? It is beyond my understanding right now. :thinking:
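The rot90/180/270 cases mentioned above can be composed from transpose and flip in pure torch, which keeps everything inside autograd. A sketch (the helper name rot90_ccw is illustrative):

```python
import torch

def rot90_ccw(x):
    # Rotate a 2-D tensor 90 degrees counterclockwise,
    # mirroring numpy.rot90: transpose, then flip the rows
    return x.t().flip(0)

m = torch.arange(6.0).reshape(2, 3)
r90 = rot90_ccw(m)       # 90 degrees
r180 = rot90_ccw(r90)    # 180 degrees
r270 = rot90_ccw(r180)   # 270 degrees
print(r90)  # tensor([[2., 5.], [1., 4.], [0., 3.]])
```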

You can read about Spatial Transformer Networks.

@cakeeatingpolarbear That may be helpful. Thanks a lot.

@AlbertZhang the derivative would be a rotation of the gradOutput in the opposite direction.
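That claim is easy to check with autograd, reusing the transpose-and-flip helper from the sketch above:

```python
import torch

def rot90_ccw(x):
    return x.t().flip(0)  # 90 degrees counterclockwise

def rot90_cw(x):
    return x.t().flip(1)  # 90 degrees clockwise

x = torch.rand(2, 3, requires_grad=True)
y = rot90_ccw(x)

# Feed a recognizable upstream gradient (the gradOutput)
grad_out = torch.arange(6.0).reshape(3, 2)
y.backward(grad_out)

# The gradient w.r.t. x is the gradOutput rotated the opposite way
print(torch.equal(x.grad, rot90_cw(grad_out)))  # True
```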

@richard Sorry for the delayed reply. Adding an API for tensor rotation would be great. Thanks so much for your help and great work.

I’ve opened an issue for that here: https://github.com/pytorch/pytorch/issues/6271 , hopefully someone will get to it.

simple and efficient :+1: