Hi Peter,

Is it possible that `samples_weight` has size 128 x 128? I'm dealing with dense image prediction, so every target image is 128 x 128. I have 3300 image pairs in total (3300 original images and 3300 segmentation masks). To deal with the class imbalance, I concatenated the target masks to find the class labels and their counts:

```
# collect every mask so class frequencies can be counted over the whole dataset
mask_total = torch.empty(0, device=device)
for i in range(len(total_data)):
    sample = total_data[i]
    mask = sample['parc1a'].float()
    mask = mask.to(device)
    mask_total = torch.cat((mask_total, mask))
```
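(As an aside, concatenating every mask can get memory-heavy. A minimal sketch of an alternative, assuming the masks hold integer class IDs in `[0, 170]`, accumulates per-class pixel counts with `torch.bincount` instead of storing all masks; the helper name `count_classes` and the toy masks are mine, not from the dataset.)

```python
import torch

NUM_CLASSES = 171  # number of classes mentioned below; adjust if different

def count_classes(masks):
    """Accumulate per-class pixel counts without concatenating all masks.

    `masks` is any iterable of integer-valued mask tensors (e.g. 128x128).
    """
    counts = torch.zeros(NUM_CLASSES, dtype=torch.long)
    for mask in masks:
        counts += torch.bincount(mask.flatten().long(), minlength=NUM_CLASSES)
    return counts

# toy usage: three fake 128x128 masks with random class IDs
fake_masks = [torch.randint(0, NUM_CLASSES, (128, 128)) for _ in range(3)]
counts = count_classes(fake_masks)
print(int(counts.sum()))  # 3 * 128 * 128 = 49152 pixels
```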

`unique_color, count = np.unique(mask_total.cpu(), return_counts=True)`

I have 171 classes in total.
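(For reference, a minimal sketch of turning such counts into per-class weights, using the common inverse-frequency scheme; the toy counts are illustrative, not from my data.)

```python
import numpy as np

# toy per-class pixel counts, as np.unique(..., return_counts=True) would return
counts = np.array([100.0, 10.0, 1.0])

# inverse-frequency weighting: rare classes get larger weights
weight = 1.0 / counts
weight = weight / weight.sum()  # optional: normalise so weights sum to 1
print(weight)
```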

I'm confused about how to construct the sampler in my case. The usual line

`samples_weight = torch.tensor([weight[t] for t in target])`

doesn't work for me, since my target has size `[422400, 128]` (3300 stacked 128 x 128 masks), so each `t` is a whole row of pixels and can't be used as a single class index.
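(One workaround I've been considering, sketched below under my own assumptions: collapse each mask to a single scalar weight per *image*, e.g. the mean of the per-pixel class weights, and pass those to `WeightedRandomSampler`. The `class_weight` values and toy masks here are placeholders.)

```python
import torch
from torch.utils.data import WeightedRandomSampler

NUM_CLASSES = 171

# hypothetical per-class weights, e.g. inverse pixel frequency
class_weight = torch.rand(NUM_CLASSES)

# toy stand-ins for the target masks (each 128x128 of integer class IDs)
masks = [torch.randint(0, NUM_CLASSES, (128, 128)) for _ in range(5)]

# one scalar weight per image: mean class weight over that image's pixels
samples_weight = torch.stack(
    [class_weight[m.long()].mean() for m in masks]
)

sampler = WeightedRandomSampler(samples_weight, num_samples=len(samples_weight))
print(samples_weight.shape)  # torch.Size([5])
```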