Using WeightedRandomSampler for imbalanced classes

I found an example of creating a sampler here and modified it for my own data, as shown below:

import numpy as np
import torch
from torch.utils.data import WeightedRandomSampler

# One label array per class (class index repeated once per sample)
cls0 = np.zeros(224, dtype=np.int32)
cls1 = np.ones(477, dtype=np.int32)
cls2 = np.full(5027, 2, dtype=np.int32)
cls3 = np.full(4497, 3, dtype=np.int32)
cls4 = np.full(483, 4, dtype=np.int32)
cls5 = np.full(247, 5, dtype=np.int32)

# Single target vector with all 10955 labels
target = np.hstack((cls0, cls1, cls2, cls3, cls4, cls5))

# Samples per class, and the inverse count as the class weight
class_sample_count = np.unique(target, return_counts=True)[1]
weight = 1. / class_sample_count

# One weight per sample, looked up by its class label
samples_weight = weight[target]

samples_weight = torch.from_numpy(samples_weight)
sampler = WeightedRandomSampler(samples_weight, len(samples_weight))
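
For reference, the sampler is then handed to the DataLoader roughly like this (a minimal sketch; train_ds stands in for my actual dataset, and sampler replaces shuffle=True since the two are mutually exclusive):

from torch.utils.data import DataLoader

train_dl = DataLoader(
    train_ds,          # placeholder for the actual Dataset
    batch_size=24,
    sampler=sampler,   # the WeightedRandomSampler built above
)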

I’m not sure this is correct, but with this sampler the targets in a batch do come out as a mix of classes.

inputs, targets = next(iter(train_dl)) # Get a batch of training data
print(targets)

tensor([1, 5, 3, 4, 3, 0, 5, 2, 0, 0, 4, 1, 5, 0, 5, 5, 5, 5, 2, 5, 1, 1, 0, 3])
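
To see whether the balance holds beyond a single batch, a quick check like this (a sketch that just tallies labels over the first ~100 batches of train_dl) should print roughly uniform counts if the sampler is working:

counts = torch.zeros(6, dtype=torch.long)
for i, (_, targets) in enumerate(train_dl):
    counts += torch.bincount(targets, minlength=6)
    if i >= 100:
        break
print(counts)   # roughly uniform counts are expected if the sampler works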

Training runs, but the number of samples loaded per epoch is the same as the total number of samples in the dataset.

total number of samples = 10955
batch_size = 24
steps per epoch = 10955 / 24 ≈ 456
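
As far as I can tell, that 456 comes straight from the second argument I passed to WeightedRandomSampler, i.e. num_samples = len(samples_weight) = 10955, which is the number of samples drawn per epoch. A small check (sketch, assuming train_dl is the loader above):

print(len(sampler))    # 10955, the num_samples passed to WeightedRandomSampler
print(len(train_dl))   # 456 with drop_last=True, 457 otherwise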

Epoch [ 1/ 2], Step [ 50, 456], Loss: 1.5504
Epoch [ 1/ 2], Step [100, 456], Loss: 1.6046
Epoch [ 1/ 2], Step [150, 456], Loss: 1.6864
Epoch [ 1/ 2], Step [200, 456], Loss: 1.6291
Epoch [ 1/ 2], Step [250, 456], Loss: 1.4469
Epoch [ 1/ 2], Step [300, 456], Loss: 1.7395
Epoch [ 1/ 2], Step [350, 456], Loss: 1.6110
Epoch [ 1/ 2], Step [400, 456], Loss: 1.4821
Epoch [ 1/ 2], Step [450, 456], Loss: 1.7239
Epoch [ 2/ 2], Step [ 50, 456], Loss: 1.3867
Epoch [ 2/ 2], Step [100, 456], Loss: 1.6165
Epoch [ 2/ 2], Step [150, 456], Loss: 1.6229
Epoch [ 2/ 2], Step [200, 456], Loss: 1.4635
Epoch [ 2/ 2], Step [250, 456], Loss: 1.5007
Epoch [ 2/ 2], Step [300, 456], Loss: 1.6607
Epoch [ 2/ 2], Step [350, 456], Loss: 1.6613
Epoch [ 2/ 2], Step [400, 456], Loss: 1.5939
Epoch [ 2/ 2], Step [450, 456], Loss: 1.4794