Oversampling the minority classes (only a single sample for two classes)

Hi,
I have an imbalanced dataset with per-class sample counts = [3, 41, 29, 7, 14, 20, 1, 15, 1, 36, 64, 22, 4, 28, 10, 2]
I want to make a DataLoader that yields an equal number of samples for each class in every batch (e.g. 4 samples x 16 classes = 64 batch_size). I have used WeightedRandomSampler, but I am not getting exactly balanced batches.
The demo code is given below:

count function

import torch

def count_(x):
    # Return the number of samples of each class that appears in x (sorted by class label)
    x_unique = x.unique(sorted=True)
    x_unique_count = torch.stack([(x == x_u).sum() for x_u in x_unique])
    return x_unique_count
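
For illustration, calling count_ on a small, hypothetical tensor of class labels returns one count per label that actually appears; classes absent from the batch simply do not show up in the output:

labels = torch.tensor([0, 1, 1, 3, 3, 3])   # hypothetical batch of targets
print(count_(labels))                        # tensor([1, 2, 3]) -- counts for classes 0, 1, 3 only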

from torch.utils.data import WeightedRandomSampler

class_counts = torch.tensor([3, 41, 29, 7, 14, 20, 1, 15, 1, 36, 64, 22, 4, 28, 10, 2])  # samples per class
train_weights = 1. / torch.as_tensor(class_counts, dtype=torch.double)  # inverse-frequency weights; must be floating point (torch.double)
train_sampleweights = train_weights[train_dataset.target]               # per-sample weight looked up from its class label
train_sampler = WeightedRandomSampler(weights=train_sampleweights, num_samples=len(train_sampleweights))
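
A note on why the batches come out only roughly balanced: with the default replacement=True, WeightedRandomSampler draws every index independently, so the inverse-frequency weights only make each class equally likely per draw. A batch of 64 therefore contains about 64 / 16 = 4 samples per class on average, with random fluctuation around that value. A minimal check of the per-class probability, using the variables defined above:

# Each class's total sampling mass is count * (1 / count) = 1, i.e. 1/16 of the whole,
# so the expected number of samples per class in a batch of 64 is 64 / 16 = 4.
per_class_mass = class_counts.double() * train_weights
print(per_class_mass / per_class_mass.sum())        # 0.0625 for every class
print(64 * per_class_mass / per_class_mass.sum())   # ~4 expected samples per class per batch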

dataloader

from torch.utils.data import DataLoader

balance_loader = DataLoader(train_dataset,
                            sampler=train_sampler,
                            batch_size=64)

for i, (data, target_) in enumerate(balance_loader):
    count = count_(target_)
    print(torch.sum(count))   # total samples in the batch
    print(count)              # per-class counts in the batch

Results (per-batch class counts):
tensor([3, 2, 4, 4, 5, 5, 4, 4, 3, 5, 5, 2, 3, 4, 9, 2])
tensor([4, 4, 5, 7, 1, 3, 6, 7, 5, 2, 4, 4, 4, 5, 3])
tensor([3, 9, 6, 5, 2, 2, 4, 2, 4, 7, 3, 1, 5, 5, 2, 4])
tensor([3, 3, 6, 3, 2, 4, 5, 2, 8, 3, 2, 7, 4, 6, 4, 2])
tensor([2, 1, 2, 3, 5, 7, 2, 3, 4, 2, 3, 2, 4, 1, 2])
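
The counts above confirm that the sampler balances the classes only on average, not exactly, and classes missing from a batch do not appear in the printed tensor at all. If exactly 4 samples of every class per batch are required, one possible alternative (a sketch, not the only way) is to pick the indices per class directly, drawing with replacement for the classes that have fewer than 4 samples, such as the two classes with a single sample. The helper below is hypothetical and assumes train_dataset.target is a 1-D tensor of class labels:

def balanced_batch_indices(targets, per_class=4):
    # Pick exactly `per_class` indices for every class; small classes are sampled with replacement.
    picks = []
    for c in targets.unique(sorted=True):
        idx = (targets == c).nonzero(as_tuple=True)[0]
        picks.append(idx[torch.randint(len(idx), (per_class,))])   # with replacement
    batch = torch.cat(picks)
    return batch[torch.randperm(len(batch))]                       # shuffle within the batch

# Hypothetical usage: one perfectly balanced batch of 4 * 16 = 64 indices
# indices = balanced_batch_indices(train_dataset.target, per_class=4)
# batch = [train_dataset[i] for i in indices]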