How to load labels of a small dataset in a single batch in PyTorch

At the moment, I just want the labels of a dataset. Whenever I try to load the labels in the following way,

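(presumably something like this minimal, hypothetical sketch, reconstructed from the reply below, with dataset standing in for the actual dataset object:)

# hypothetical reconstruction of the missing snippet: load the whole
# dataset as a single batch and keep only the labels
trainloader = torch.utils.data.DataLoader(dataset, batch_size=len(dataset), shuffle=True)
labels = next(iter(trainloader))[1]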

I get the error "PID was killed". Any solution for this?

The length of trainloader is 1.
I try to iterate through it once to get all the labels of the dataset.
I get the error "PID <some number, e.g. 160 or 240> was killed".

This clearly shouldn't exceed the memory limit, since it's merely the labels.

Hi,

The length of trainloader has to be 1, since you defined batch_size=len(dataset).
Your code is correct. Are you using a custom dataset? The problem is evidently not in the code you have posted.

Here is a small snippet that demonstrates your approach:

import torch
from torch.utils import data

# dummy dataset: 10 samples of shape (3, 2, 2) with binary labels
x = torch.randint(0, 2, size=(10, 3, 2, 2)).float()
y = torch.randint(0, 2, size=(10, 1))
dataset = data.TensorDataset(x, y)
trainloader = data.DataLoader(dataset, batch_size=len(dataset), shuffle=True)

# a single iteration yields (inputs, labels) for the whole dataset
temp = iter(trainloader)
labels = next(temp)[1]
labels.shape  # torch.Size([10, 1])
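
Note that even though only the labels are used, the DataLoader above still loads the full batch of inputs x into memory, which can exhaust RAM for a large dataset. If only the labels are needed, they can often be read directly from the dataset; a minimal sketch, assuming a TensorDataset as above:

# TensorDataset keeps its underlying tensors in the .tensors attribute, here (x, y)
all_labels = dataset.tensors[1]
all_labels.shape  # torch.Size([10, 1])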

Bests