I created a dataset class that alternately returns data elements from two different datasets I have. The return type of the
__getitem__ function is a tuple of numpy arrays.
When I use a batch size of 2 and the DataLoader stacks the two outputs along the 0th (batch) dimension, I get the following error:
RuntimeError: Expected a Tensor of type torch.FloatTensor but found a type torch.DoubleTensor for sequence element 1 in sequence argument at position #1 'tensors'
This doesn't make sense to me, as I only used numpy arrays, so FloatTensor and DoubleTensor are probably something the DataLoader creates internally, and I am not sure why I am getting the error.
Could you provide a small code snippet reproducing this error? It does indeed sound like a misleading error message!
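A minimal sketch of the kind of setup that could trigger this (the `FusedDataset` class and the array shapes are assumptions, not the original code): a fused dataset that alternates between two sources whose numpy dtypes differ. With `batch_size=2`, each batch pairs one float32 item with one float64 item, which is exactly the mix the collate step then tries to stack:

```python
import numpy as np
from torch.utils.data import DataLoader, Dataset

class FusedDataset(Dataset):
    """Hypothetical stand-in: alternately returns items from two sources."""
    def __init__(self):
        self.a = np.zeros((4, 3), dtype=np.float32)  # first source: float32
        self.b = np.zeros((4, 3))                    # second source: float64 (numpy default)

    def __len__(self):
        return len(self.a) + len(self.b)

    def __getitem__(self, idx):
        # even indices come from the first source, odd from the second
        src = self.a if idx % 2 == 0 else self.b
        return src[idx // 2]

ds = FusedDataset()
print(ds[0].dtype, ds[1].dtype)  # float32 float64: a batch of 2 mixes dtypes

# With batch_size=2 the DataLoader stacks one float32 and one float64 item,
# which is what produces the FloatTensor/DoubleTensor mismatch error.
loader = DataLoader(ds, batch_size=2)
```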
Sorry, my bad. I thought correcting the shape in the dataloader had solved the problem, but the error came back.
Also, I don't understand the reason: my
__getitem__ function returns a tuple of numpy arrays, so I don't understand why an error about FloatTensor and DoubleTensor would come up.
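For context, a short sketch of where the tensor types come from even though __getitem__ returns numpy arrays: the DataLoader's default collate function converts each numpy array to a tensor, and the tensor dtype follows the numpy dtype, so float32 arrays become FloatTensors while float64 arrays (numpy's default) become DoubleTensors:

```python
import numpy as np
import torch

a = np.zeros(3, dtype=np.float32)  # explicitly float32
b = np.zeros(3)                    # numpy defaults to float64

# The collate step performs a conversion equivalent to this:
print(torch.from_numpy(a).dtype)   # torch.float32 -> "FloatTensor"
print(torch.from_numpy(b).dtype)   # torch.float64 -> "DoubleTensor"
```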
I checked with separate dataloaders for the separate dataset classes which I call inside my fused dataset class.
There is indeed a mismatch: in the first dataloader a 5D numpy array is converted to a torch.FloatTensor, while in the other it is converted to a torch.DoubleTensor.
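A sketch of that kind of check, with a hypothetical toy `ArrayDataset` standing in for the two real dataset classes: wrap each array source in its own Dataset, pull one batch from each DataLoader, and compare the batch dtypes:

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, Dataset

class ArrayDataset(Dataset):
    """Toy stand-in for each of the two real dataset classes."""
    def __init__(self, arr):
        self.arr = arr
    def __len__(self):
        return len(self.arr)
    def __getitem__(self, idx):
        return self.arr[idx]

ds_a = ArrayDataset(np.zeros((4, 3), dtype=np.float32))  # batches -> FloatTensor
ds_b = ArrayDataset(np.zeros((4, 3)))                    # float64 -> DoubleTensor

dtypes = {}
for name, ds in [("first", ds_a), ("second", ds_b)]:
    batch = next(iter(DataLoader(ds, batch_size=2)))
    dtypes[name] = batch.dtype
    print(name, batch.dtype)  # the two dtypes will not match
```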
Solved: I set all numpy arrays' dtype to double via the astype function.
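A sketch of that fix: cast both sources to one common dtype with astype before returning them (double here, as in the thread; casting everything to float32 would work just as well, the point is consistency so the collated tensors match):

```python
import numpy as np
import torch

a = np.zeros((2, 3), dtype=np.float32)
b = np.zeros((2, 3))                     # already float64

# Cast everything to one dtype so collation stacks matching tensor types
a64 = a.astype(np.float64)
assert torch.from_numpy(a64).dtype == torch.from_numpy(b).dtype  # both DoubleTensor
```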