RuntimeError: slice() cannot be applied to a 0-dim tensor

I want to put labels and their shuffles beside each other, such as:

labels_total = [labels, labels_shuffel[1], labels_shuffel[2], ..]

I tried for loop:

labels_total =  labels
for i in range(5):
    labels_total = [labels_total, labels_shuffel[i]]

But this one gives a nested list. When I used torch.FloatTensor, it gave this error:

TypeError: expected torch.FloatTensor (got torch.LongTensor)

Then I used torch.LongTensor, now it’s giving this error:

RuntimeError: slice() cannot be applied to a 0-dim tensor.

How can I solve it?

Hi Niki,

Actually, you are concatenating tensors. You can use the following code to concatenate them (this is the conventional way to concatenate tensors in PyTorch):

for i in range(5):
    labels_total = torch.cat((labels_total, labels_shuffel[i]))

Originally, you were getting a type error. Converting that particular tensor to float should fix it.
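To make this concrete, here is a minimal runnable sketch of the loop above. The concrete values of `labels` and `labels_shuffel` are placeholders (the thread never shows them); I assume `labels` is a 1-D LongTensor and `labels_shuffel` is a list of shuffled copies of it:

```python
import torch

# Hypothetical data standing in for the thread's variables:
labels = torch.tensor([0, 1, 2, 3])
labels_shuffel = [labels[torch.randperm(len(labels))] for _ in range(5)]

labels_total = labels
for i in range(5):
    # torch.cat joins tensors along an existing dimension (dim 0 here),
    # so labels_total stays a flat 1-D tensor instead of a nested list.
    labels_total = torch.cat((labels_total, labels_shuffel[i]))

print(labels_total.shape)  # torch.Size([24]) -- 4 original + 5 * 4 shuffled
```

Because `torch.cat` appends along dim 0, the first `len(labels)` entries of `labels_total` are still the original labels in their original order.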

Thank you for your reply @Pranavan_Theivendira.
It solved the error, but if later I want to extract all labels_shuffels from labels_total, how can I do that?


You can slice labels_total if you know the indices. For example, if the initial number of elements is N, you can recover the original labels as follows:

labels_total_initial = labels_total[:N]

Likewise, you can slice the other labels_shuffel contents in a similar way. For example, if you need all the label shuffels, you can do the following:

labels_shuffels_prev = labels_total[N:]
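Putting both slices together, here is a small self-contained sketch. The tensor values are made up for illustration; the only assumption is that `labels_total` was built by concatenating the original `labels` first, followed by the shuffles:

```python
import torch

# Hypothetical setup matching the thread: N original labels, then shuffles.
labels = torch.tensor([10, 20, 30])
shuffles = [torch.tensor([30, 10, 20]), torch.tensor([20, 30, 10])]
labels_total = torch.cat([labels] + shuffles)

N = len(labels)
labels_total_initial = labels_total[:N]   # the original labels
labels_shuffels_prev = labels_total[N:]   # everything that was appended

print(labels_total_initial)  # tensor([10, 20, 30])
```

Note that slicing a tensor returns a view, so no data is copied until you modify or `.clone()` the slice.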


Thank you for your reply, @Pranavan_Theivendira.
If I want to extract each of those shuffled ones individually, I mean labels_shuffel[1], labels_shuffel[2], … how can I do that?

Hi Niki,

For that, you need the size of each individual shuffel. For example, if the size of shuffel[1] is M, you can extract it as follows:

shuffel_1 = labels_total[N:N+M]

You have to use the proper indices to slice them correctly.
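As a sketch of the offset arithmetic: if every shuffle has the same length M (an assumption for this example; with unequal lengths you would keep a running offset instead), shuffle k occupies the slice `[N + k*M : N + (k+1)*M]`:

```python
import torch

# Hypothetical data: original labels followed by two concatenated shuffles.
labels = torch.tensor([1, 2, 3])
shuffles = [torch.tensor([3, 1, 2]), torch.tensor([2, 3, 1])]
labels_total = torch.cat([labels] + shuffles)

N = len(labels)       # size of the original labels
M = len(shuffles[0])  # size of each shuffle (assumed equal here)

# Shuffle k lives at [N + k*M : N + (k+1)*M].
shuffel_1 = labels_total[N:N + M]
shuffel_2 = labels_total[N + M:N + 2 * M]

print(shuffel_1)  # tensor([3, 1, 2])
```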

FYI, you did not delete the original shuffel arrays/lists; they are still in memory, so you can also just use them directly.



Thank you very much for your help, @Pranavan_Theivendira.