DataLoader overrides tensor type?

Hi all,
I am just starting out with PyTorch, and I am running into the following issue (which ultimately results in a “TypeError: torch.addmm received an invalid combination of arguments” when I run it with a network).

The problem is that although I set the data type to float when converting the numpy arrays into torch tensors, the DataLoader seems to reset the label type to double for the mini-batches (i.e. the mini-batch types do not match the original tensor types - see the example below). What is the correct way to keep the types the same for mini-batch data and labels?

    import sklearn.datasets
    import torch

    boston = sklearn.datasets.load_boston()
    x = torch.from_numpy(boston.data).float()
    y = torch.from_numpy(boston.target).float()
    dataset = torch.utils.data.TensorDataset(x, y)
    dataloader = torch.utils.data.DataLoader(dataset, batch_size=5, shuffle=True)
    for x_mini, y_mini in dataloader:
        print(type(x))
        print(type(y))
        print(type(x_mini))
        print(type(y_mini))
        break

This outputs:

    <class 'torch.FloatTensor'>
    <class 'torch.FloatTensor'>
    <class 'torch.FloatTensor'>
    <class 'torch.DoubleTensor'>

(I would expect the same output, <class 'torch.FloatTensor'>, all four times.)

Try to reshape your target:

    y = torch.from_numpy(boston.target.reshape(-1, 1)).float()

Because your target is 1-dimensional, indexing it yields plain Python floats rather than tensors, and PyTorch's default collate function turns a batch of Python floats into a DoubleTensor.
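You can see this collate behavior in isolation with a minimal sketch (the dataset class and its data here are hypothetical, not the Boston example): when `__getitem__` returns the target as a plain Python float, the DataLoader batches it as float64 even though the features stay float32.

```python
import torch
from torch.utils.data import DataLoader, Dataset

class FloatTargetDataset(Dataset):
    """Hypothetical dataset whose targets come back as plain Python floats,
    which is what indexing a 1-d target can yield."""

    def __init__(self):
        self.x = torch.randn(10, 3)              # features stay float32
        self.y = [float(i) for i in range(10)]   # targets are Python floats

    def __len__(self):
        return len(self.y)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

loader = DataLoader(FloatTargetDataset(), batch_size=5)
x_mini, y_mini = next(iter(loader))
print(x_mini.dtype)  # torch.float32 -- tensors keep their dtype when stacked
print(y_mini.dtype)  # torch.float64 -- default_collate maps Python floats to double
```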

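As a quick end-to-end check, here is a sketch using random stand-in arrays (same shapes as the Boston data, but not the real dataset) showing that with the target reshaped to 2-d, both mini-batch tensors come back as float32:

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in arrays with the same shapes as the Boston data.
data = np.random.rand(506, 13)
target = np.random.rand(506)

x = torch.from_numpy(data).float()
y = torch.from_numpy(target.reshape(-1, 1)).float()  # keep the target 2-d

dataset = TensorDataset(x, y)
loader = DataLoader(dataset, batch_size=5, shuffle=True)
x_mini, y_mini = next(iter(loader))
print(x_mini.dtype, y_mini.dtype)    # torch.float32 torch.float32
print(x_mini.shape, y_mini.shape)    # batches of 5 x 13 features, 5 x 1 targets
```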