Updating Dataset Class Variables from Multiple Processes

I am writing my own torch.utils.data.Dataset, which has a variable that gets updated as the Dataset loads data entries. For example, suppose I have 10 entries in my dataset and a list of integer counters that keeps track of how many times each entry is loaded during training (this is just an example illustrating the issue of interest). The Dataset object would look something like this:

import numpy as np
from torch.utils.data import Dataset

class MyDataset(Dataset):
    def __init__(self, X, y):
        # X: list of inputs, say, with shape (10, 3, H, W) if each entry is an RGB image
        # y: list of outputs, say, with shape (10, 1) if each label is just one integer class label
        self.X = X
        self.y = y

        # Counters to keep track of how many times each item is used.
        self.counters = np.zeros(10, dtype=np.int64)

        # ....

    def __len__(self):
        return len(self.X)

    def __getitem__(self, i):
        # Mark entry "i" as being used here once.
        self.counters[i] += 1

        # other steps to load and process the data
        return self.X[i], self.y[i]

I believe this works when the Dataset runs in a single worker/process (sorry, I have not done a thorough experiment on this). However, as one may suspect, this breaks down with multiple workers/processes: as far as I understand, each worker process gets its own copy of the Dataset, so each copy's counters are updated independently and the main process never sees those updates.
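
For instance, here is a minimal check of what I mean, using the MyDataset class above (the dummy shapes and worker count are just assumptions for illustration):

import torch
from torch.utils.data import DataLoader

# 10 dummy entries matching the shapes described above (H = W = 8 here).
X = torch.randn(10, 3, 8, 8)
y = torch.randint(0, 2, (10, 1))
dataset = MyDataset(X, y)

# On macOS/Windows (spawn start method) this would need to run under an
# if __name__ == "__main__": guard.
loader = DataLoader(dataset, batch_size=2, num_workers=2)
for _ in loader:
    pass

# With num_workers=0 this prints all ones after one epoch; with
# num_workers=2 it prints all zeros, because each worker incremented
# its own copy of the counters.
print(dataset.counters)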

So I am wondering: does PyTorch provide some mechanism that helps with modifying class variables of torch.utils.data.Dataset objects when multiple workers/processes are enabled?
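
One idea I have considered is moving the counters into shared memory, e.g. via multiprocessing.Array, so that all workers and the main process update the same underlying buffer. Here is a minimal sketch of what I mean (SharedCounterDataset is just a name I made up, and I am not sure this is the recommended approach; it seems to work with the fork start method on Linux, but I have not verified spawn):

import multiprocessing as mp
from torch.utils.data import Dataset

class SharedCounterDataset(Dataset):
    def __init__(self, X, y):
        self.X = X
        self.y = y
        # 'i' = C int; the Array lives in shared memory, so the DataLoader
        # workers and the main process all see the same buffer.
        self.counters = mp.Array('i', len(X))

    def __len__(self):
        return len(self.X)

    def __getitem__(self, i):
        # += is a read-modify-write, so take the lock explicitly to make
        # the increment atomic across workers.
        with self.counters.get_lock():
            self.counters[i] += 1
        return self.X[i], self.y[i]

If this works the way I expect, print(list(dataset.counters)) in the main process would show the accumulated counts after an epoch, even with num_workers > 0.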

Thank you.