Store numpy.ndarray in a Dataset without loading them back as tensors

Hi,
I need to store some numpy.ndarray objects in a dataset, but when I load them back out, the arrays seem to be automatically converted to tensors. Am I doing something wrong, or how can I still get arrays out?
Thanks in advance!

`x.numpy()` turns a tensor `x` back into a numpy array, if that helps.
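A minimal sketch of that conversion (the tensor `x` here is just a dummy for illustration):

import torch

x = torch.randn(3)   # a CPU tensor
arr = x.numpy()      # returns a numpy.ndarray that shares memory with x
print(type(arr))     # <class 'numpy.ndarray'>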

Hey,
thanks, but what I meant is whether the dataset can just store the ndarray as it is instead of turning it into tensors, because otherwise converting from numpy to tensor and back to numpy is too wasteful :/

The Dataset class itself does not turn your data into Tensors.
Do you use any transformations from torchvision (e.g. `ToTensor()`)?
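If you do, that transform is where the conversion happens. A small sketch (the array shape here is only an assumption for illustration):

import numpy as np
from torchvision import transforms

a = np.random.rand(10, 10, 3).astype(np.float32)  # H x W x C numpy array
t = transforms.ToTensor()(a)                      # converted to a C x H x W torch.Tensor
print(type(t))                                    # <class 'torch.Tensor'>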

Have a look at this code. I created a dummy Dataset and stored a numpy array in it:

import numpy as np
from torch.utils.data import Dataset


class MyDataset(Dataset):
    def __init__(self, a):
        self.data = a

    def __getitem__(self, index):
        # returns the stored numpy slice unchanged
        return self.data[index]

    def __len__(self):
        return len(self.data)


a = np.random.randn(100, 10, 10)
dataset = MyDataset(a)
b = dataset[0]
print(type(b))

>> <class 'numpy.ndarray'>
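One thing to watch out for: if you wrap the dataset in a DataLoader, its default collate function will stack the samples into tensors when batching. A sketch of a custom collate_fn that keeps the batch as a numpy array (assuming the `dataset` defined above):

import numpy as np
from torch.utils.data import DataLoader

# the default collate_fn would return torch.Tensor batches;
# this one stacks the raw numpy samples instead
loader = DataLoader(dataset, batch_size=4,
                    collate_fn=lambda samples: np.stack(samples))
batch = next(iter(loader))
print(type(batch))  # <class 'numpy.ndarray'>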