torch.from_numpy does not support negative strides

When using torch.from_numpy on an ndarray with negative strides, there is a runtime error:
'RuntimeError: some of the strides of a given numpy array are negative.'
For example,

import numpy as np
import torch

x = np.random.random(size=(32, 32, 7))
torch.from_numpy(np.flip(x, axis=0))

RuntimeError: some of the strides of a given numpy array are negative. This is currently not supported, but will be added in future releases.

The same error occurs with np.rot90().

How about

torch.from_numpy(np.flip(x, axis=0).copy())

Thanks for your recommendation, I have solved this problem.

Also works for me, thanks! Though I don't know why…

ndarray.copy() will allocate new memory for the numpy array, which makes it normal; I mean, the strides are not negative any more.
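To make this concrete, here is a short sketch (the shape matches the earlier example; the exact values don't matter) showing how .copy() turns the negative stride back into a positive one:

```python
import numpy as np

x = np.random.random(size=(32, 32, 7))
flipped = np.flip(x, axis=0)

# np.flip returns a view: the first-axis stride is negated,
# no data is moved.
print(flipped.strides[0] < 0)              # True

# .copy() writes the data out fresh in C order, so every
# stride is positive again and torch.from_numpy accepts it.
copied = flipped.copy()
print(all(s > 0 for s in copied.strides))  # True
```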

Excuse me, I'm puzzled about the word 'normal'. What do you mean by a 'normal' numpy array?

It's like a contiguous tensor: the data is stored in order in memory.

Hello,

I just don't understand what "negative stride" means. Could you please explain it for me? Thank you very much!!

This means that your numpy array has undergone an operation such as:
image = image[..., ::-1]
I guess this has something to do with how numpy arrays are stored in memory, and unfortunately PyTorch doesn't currently support numpy arrays that have been reversed using a negative stride.
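A stride is the number of bytes NumPy steps through memory to move one position along an axis; reversing an axis simply negates that step, without moving any data. A small sketch (dtype pinned to int64 so the byte counts are deterministic):

```python
import numpy as np

img = np.arange(12, dtype=np.int64).reshape(3, 4)
print(img.strides)       # (32, 8): 8-byte items, rows of 4

rev = img[..., ::-1]
print(rev.strides)       # (32, -8): last axis now walks backwards

# Only the stride bookkeeping changed; both arrays share memory.
print(np.shares_memory(img, rev))   # True
```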

A simple fix is to do

image = image[..., ::-1] - np.zeros_like(image)
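That trick works because the subtraction forces NumPy to materialize a brand-new output array, which is laid out contiguously with positive strides. A quick check (array shape chosen arbitrarily for illustration):

```python
import numpy as np

image = np.random.random((4, 4))
fixed = image[..., ::-1] - np.zeros_like(image)

# The arithmetic produces a fresh C-contiguous array...
print(fixed.flags['C_CONTIGUOUS'])              # True
# ...with the same values as the reversed view.
print(np.array_equal(fixed, image[..., ::-1]))  # True
```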

OK, I understand it now. There is a function in numpy like contiguous in pytorch. You can try it. Thanks.





If you don't want the image flipped (for example, because you have already trained a network on un-flipped images), then you can save and reload the image before passing it in for inference.

I think this is a more elegant solution to the problem leveraging the PyTorch API.

torch.flip(torch.from_numpy(x), dims=(0,))

Thanks! It does work. :partying_face:

One can also make the array contiguous again with

x = np.flip(x, axis=0)
torch.from_numpy(np.ascontiguousarray(x))
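For what it's worth, np.ascontiguousarray behaves much like .contiguous() in PyTorch: it copies only when the input is not already C-contiguous, so it is a no-op on arrays that are already fine. A small numpy-only sketch:

```python
import numpy as np

x = np.random.random((8, 8))
flipped = np.flip(x, axis=0)

# The flipped view has a negative stride, hence not C-contiguous.
print(flipped.flags['C_CONTIGUOUS'])   # False

# ascontiguousarray copies it into a fresh C-ordered buffer.
fixed = np.ascontiguousarray(flipped)
print(fixed.flags['C_CONTIGUOUS'])     # True

# Already-contiguous input is returned unchanged (no copy).
print(np.ascontiguousarray(x) is x)    # True
```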