Actually, you are concatenating tensors. You can use the following code to concatenate them (this is the conventional way to concatenate tensors in PyTorch):
labels_total = torch.empty(0)  # start from an empty tensor so the first cat works
for i in range(5):
    labels_total = torch.cat((labels_total, labels_shuffel[i]))
Regarding the original error: you are getting a type error because the dtypes do not match. Cast that particular tensor to float (e.g. with `.float()`); that should fix it.
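A minimal runnable sketch of the two fixes together (the tensor sizes and contents here are hypothetical, just for illustration):

```python
import torch

# Hypothetical setup: five shuffled label tensors of 4 integer labels each
labels_shuffel = [torch.randint(0, 10, (4,)) for _ in range(5)]

# Start from an empty float tensor so torch.cat has something to append to
labels_total = torch.empty(0)

for i in range(5):
    # Cast each chunk to float to avoid the dtype mismatch in torch.cat
    labels_total = torch.cat((labels_total, labels_shuffel[i].float()))

print(labels_total.shape)  # torch.Size([20])
```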
Thank you for your reply @Pranavan_Theivendira.
It solved the error, but if I later want to extract all of the labels_shuffel tensors back out of labels_total, how can I do that?
You can slice if you know the indices. For example, if the initial number of elements is N, you can slice the initial content of labels_total as follows:
labels_total_initial = labels_total[:N]
Likewise, you can slice the other labels_shuffel contents in the same way. For example, if you need all of the shuffled chunks, you can slice each one out by its offset.
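A short sketch of recovering every chunk, assuming (hypothetically) that all five chunks have the same length N and were concatenated in order:

```python
import torch

# Hypothetical sizes: five chunks of N = 4 elements each were concatenated
N = 4
labels_total = torch.arange(5 * N, dtype=torch.float32)

# Recover each original labels_shuffel[i] by slicing a fixed-size window
recovered = [labels_total[i * N:(i + 1) * N] for i in range(5)]

# Equivalently, torch.split returns all the chunks at once
chunks = torch.split(labels_total, N)
```

If the chunks have different lengths, you would need to record each length and pass the list of lengths to `torch.split` instead of a single integer.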
Thank you for your reply, @Pranavan_Theivendira.
If I wanted to extract each of the shuffled chunks individually, I mean labels_shuffel[1], labels_shuffel[2], …, how could I do that?