When I use a PyTorch DataLoader to load my test data, I set shuffle=False like this:
test_loader = DataLoader(test_set, batch_size=batch_size, shuffle=False)
The labels it returns are all 1, which should be impossible because my dataset contains samples from different classes (and thus different labels).
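For context, here is a minimal self-contained sketch of my setup. The dataset here is a hypothetical stand-in (TensorDataset with synthetic data stored in class order, which I assume mirrors how my real test set is laid out on disk), not my actual data:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in for my test set: 20 classes, 64 samples each,
# stored in class order (all of class 0 first, then class 1, ...).
features = torch.randn(1280, 8)
labels = torch.arange(20).repeat_interleave(64)  # [0]*64 + [1]*64 + ...

test_set = TensorDataset(features, labels)
batch_size = 64
test_loader = DataLoader(test_set, batch_size=batch_size, shuffle=False)

# With shuffle=False, the first batch is just the first 64 samples,
# so every label in it belongs to a single class.
xb, yb = next(iter(test_loader))
print(yb.unique())  # → tensor([0])
```

With this layout, each batch drawn without shuffling contains only one class, which looks just like the behavior I'm seeing.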
When I change the shuffle argument of the DataLoader to shuffle=True,
test_loader = DataLoader(test_set, batch_size=batch_size, shuffle=True)
the labels are finally returned normally:
Labels: tensor([11, 0, 12, 5, 13, 16, 12, 16, 18, 7, 10, 10, 8, 18, 14, 16, 14, 3,
15, 6, 0, 10, 6, 10, 0, 18, 14, 0, 7, 10, 5, 15, 11, 7, 0, 9,
11, 13, 8, 11, 6, 16, 10, 8, 10, 18, 9, 4, 7, 10, 5, 18, 3, 12,
5, 9, 8, 6, 15, 3, 14, 12, 17, 14])
Labels: tensor([14, 14, 8, 12, 15, 7, 6, 14, 8, 9, 17, 12, 16, 0, 17, 1, 7, 2,
16, 14, 10, 15, 7, 8, 14, 16, 4, 17, 9, 15, 6, 6, 6, 18, 5, 0,
8, 10, 2, 0, 8, 6, 5, 17, 16, 18, 10, 9, 11, 7, 7, 10, 18, 7,
4, 7, 9, 4, 18, 6, 18, 6, 5, 10])
The problem is solved, but I don't understand why the DataLoader returns all the same labels when shuffle is set to False. Can anyone explain this to me?