Hello. I would like to ask for a mechanism in PyTorch to control the seed of random number generators per instance. For people who use PyTorch from a multi-threaded application, if several threads perform model initialization with randomized schemes (He initialization, LeCun initialization, etc.), the results are very likely not reproducible, because the generator state is part of global state, and from the Python API users can only access it via functions like torch.manual_seed.
I think it would be better to redesign this and make the random generator state part of the model itself.
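As a sketch of this direction: PyTorch already exposes torch.Generator objects that hold their own state, and tensor-creation functions such as torch.randn accept a generator argument. The snippet below (my illustration, not part of the proposal text) shows how explicit generators decouple a model's randomness from the global seed; the proposal would extend this idea to module initialization.

```python
import torch

# Each torch.Generator carries its own RNG state, independent of the
# global generator that torch.manual_seed controls.
g1 = torch.Generator().manual_seed(123)
g2 = torch.Generator().manual_seed(123)

# Mutating the global seed from another thread would not affect these draws.
torch.manual_seed(999)

a = torch.randn(4, generator=g1)
b = torch.randn(4, generator=g2)

# Identically seeded per-instance generators produce identical results.
assert torch.equal(a, b)
print(a)
```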
Here is a code snippet which demonstrates the problem with global state for numpy.random and Python's random:
import numpy as np
import random
import threading, time

class WorkerThread(threading.Thread):
    def __init__(self, i, sleep_seconds):
        threading.Thread.__init__(self)
        self.th_number = i
        self.sleep_seconds = sleep_seconds

    def run(self):
        # Every thread reseeds the same *global* generators...
        np.random.seed(123)
        random.seed(123)
        time.sleep(self.sleep_seconds)
        # ...so these draws depend on whatever the other threads
        # did to the shared state in the meantime.
        print(self.th_number, np.random.random(), "(np.random)")
        print(self.th_number, random.random(), "(random)")
th = [WorkerThread(k, 1*k) for k in range(3)]
for t in th: t.start()
for t in th: t.join()
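For comparison, NumPy's newer API already offers what this proposal asks for: np.random.default_rng returns a Generator object whose state is local to the instance, so each thread can own its generator and reproducibility survives concurrency. This is my illustrative rewrite of the snippet above, not code from the proposal.

```python
import threading, time
import numpy as np

results = {}

class SafeWorker(threading.Thread):
    def __init__(self, i, sleep_seconds):
        super().__init__()
        self.th_number = i
        self.sleep_seconds = sleep_seconds

    def run(self):
        # Per-thread generator instance: no shared global state to race on.
        rng = np.random.default_rng(123)
        time.sleep(self.sleep_seconds)
        results[self.th_number] = rng.random()

threads = [SafeWorker(k, 0.1 * k) for k in range(3)]
for t in threads: t.start()
for t in threads: t.join()

# Every thread observes the same first draw from its own seeded generator,
# regardless of scheduling order.
assert len(results) == 3
assert len(set(results.values())) == 1
print(results)
```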