I want to put labels and their shuffles beside each other, such as:
labels_total = [labels, labels_shuffel, labels_shuffel, ..]
I tried for loop:
labels_total = labels
for i in range(5):
    labels_total = [labels_total, labels_shuffel[i]]
But this builds a nested list instead of a tensor. When I tried to convert it with
torch.FloatTensor, it gave this error:
TypeError: expected torch.FloatTensor (got torch.LongTensor)
Then I tried
torch.LongTensor, and now it gives this error:
RuntimeError: slice() cannot be applied to a 0-dim tensor.
How can I solve it?
What you actually want here is tensor concatenation. The conventional way to concatenate tensors in PyTorch is torch.cat:
for i in range(5):
    labels_total = torch.cat((labels_total, labels_shuffel[i]))
As for the original error: you were getting a type mismatch, so convert that particular tensor to float (e.g. with .float()). That should help.
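As a runnable sketch of the above (the label values and the number of shuffles are made up for illustration; only `labels` and `labels_shuffel` come from your post):

```python
import torch

# Hypothetical data: 4 class labels plus 2 shuffled copies of them.
labels = torch.tensor([0, 1, 2, 3])  # a LongTensor, as in your error message
labels_shuffel = [labels[torch.randperm(labels.numel())] for _ in range(2)]

labels_total = labels
for i in range(2):
    # torch.cat joins tensors along dim 0, keeping a flat 1-D tensor
    labels_total = torch.cat((labels_total, labels_shuffel[i]))

# labels_total now has 4 + 2*4 = 12 elements, still a LongTensor.
# If something downstream needs floats, convert explicitly:
labels_total_float = labels_total.float()
```

This avoids the nested-list problem entirely, because torch.cat always returns a single flat tensor rather than a pair.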
Thank you for your reply @Pranavan_Theivendira.
It solved the error, but if later I want to extract all
labels_shuffels from labels_total, how can I do that?
You can slice labels_total if you know the indices. For example, if the initial number of elements is N, you can recover the initial content as follows.
labels_total_initial = labels_total[:N]
Likewise, you can slice the other labels_shuffel contents in the same way. For example, if you need all the shuffled labels together, you can do the following.
labels_shuffels_prev = labels_total[N:]
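Putting the two slices together in a small runnable sketch (sizes and values are invented for illustration):

```python
import torch

N = 4  # number of elements in the original `labels`
labels = torch.arange(N)
labels_shuffel = [labels[torch.randperm(N)] for _ in range(2)]

# Build labels_total the same way as above, via torch.cat.
labels_total = torch.cat([labels] + labels_shuffel)

labels_total_initial = labels_total[:N]   # the original labels
labels_shuffels_prev = labels_total[N:]   # everything appended after them
```

Because torch.cat preserves order along dim 0, the first N entries are always the original labels and the rest are the shuffles, in the order they were appended.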
Thank you for your reply, @Pranavan_Theivendira.
If I wanted to extract each of the shuffled ones individually, I mean the first
labels_shuffel, the second labels_shuffel, and so on, how can I do that?
For that, you need the size of each individual shuffle. For example, if each shuffle has M elements, you can extract the first one as follows.
shuffel_1 = labels_total[N:N+M]
You have to use the proper indices to slice them correctly.
FYI, you never deleted the shuffle arrays/lists, so they are still in memory; you can also just use them directly.
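A sketch of both approaches, using invented sizes; when all the chunks have the same length M, torch.split can recover them in one call instead of slicing by hand:

```python
import torch

N = 4       # size of the original `labels`
M = 4       # size of each shuffle (equal to N here, since each is a permutation)
labels = torch.arange(N)
shuffles = [labels[torch.randperm(N)] for _ in range(3)]
labels_total = torch.cat([labels] + shuffles)

# Manual slicing by offset, as described above: the first shuffle starts at N.
shuffel_1 = labels_total[N:N + M]

# Or recover every chunk at once: torch.split returns a tuple of
# 1-D tensors, each of length M, in their original order.
parts = torch.split(labels_total, M)
# parts[0] is the original labels, parts[1] is the first shuffle, etc.
```

torch.split only needs the chunk size (or a list of per-chunk sizes, if they differ), which is usually less error-prone than computing N + k*M offsets by hand.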
Thank you very much for your help, @Pranavan_Theivendira.