This question has been asked with slight variations, but I could not really get an answer to my (simple) problem.
```python
all_labels = torch.Tensor([])  # torch.Size([0])
for batch_idx, (images, labels) in enumerate(train_loader, 1):
    all_labels = torch.cat((all_labels, labels))  # labels is torch.Size([32])
```

```
RuntimeError: Expected object of scalar type Float but got scalar type Long for sequence element 1 in sequence argument at position #1 'tensors'
```
What is happening here? I would like to understand the meaning of this error message and the reason for the problem. I basically just want to append the labels of each batch to all_labels.
Since you are calling torch.Tensor (uppercase T in Tensor), you are creating an empty FloatTensor, while the labels coming out of the DataLoader are LongTensors, so torch.cat complains about the dtype mismatch.
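You can see the mismatch directly by checking the dtypes (the label values below are just placeholders):

```python
import torch

# torch.Tensor (uppercase T) creates a FloatTensor, even when empty
empty = torch.Tensor([])
print(empty.dtype)   # torch.float32

# integer class labels, as a DataLoader would typically yield them
labels = torch.tensor([3, 1, 2])
print(labels.dtype)  # torch.int64, i.e. Long
```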
You could avoid this error by defining the empty tensor as a LongTensor:
```python
all_labels = torch.tensor([]).long()
for _ in range(5):
    all_labels = torch.cat((all_labels, torch.tensor([1])))
```
However, I would recommend storing the labels in a plain Python list and converting it to a tensor afterwards, which should be faster than your current approach, since calling torch.cat in every iteration re-allocates and copies the growing result tensor each time.
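A minimal sketch of that approach, using dummy label batches in place of a real train_loader:

```python
import torch

# dummy batches standing in for the labels yielded by a DataLoader
fake_batches = [torch.randint(0, 10, (32,)) for _ in range(5)]

label_list = []
for labels in fake_batches:
    label_list.append(labels)  # cheap: just stores a reference

# concatenate once at the end instead of re-allocating every iteration
all_labels = torch.cat(label_list)
print(all_labels.shape)  # torch.Size([160])
print(all_labels.dtype)  # torch.int64
```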
Based on your second code snippet it seems text is already a tensor, so I’m unsure why you would want to use torch.cat on it again.
Anyway, the inputs to torch.cat should be a tuple of tensors, so you would have to use torch.cat((text,)).
However, this won’t change the shape of text, since it’s already a single tensor.
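To illustrate, with an assumed stand-in tensor for text:

```python
import torch

text = torch.randint(0, 100, (4, 16))  # assumed stand-in for the text tensor

# torch.cat expects a tuple (or list) of tensors, so a
# single-element tuple is valid input...
out = torch.cat((text,))

# ...but concatenating one tensor just gives back an equal tensor
print(out.shape == text.shape)   # True
print(torch.equal(out, text))    # True
```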