After fixing the random seed in PyTorch, why can the results be reproduced only in the first few epochs?

After fixing the random seed in PyTorch, only the results of the first few epochs can be reproduced; the results of later epochs cannot.
Here is my seed setup:
import os
import random
import numpy as np
import torch

jj = 0
np.random.seed(jj)
random.seed(jj)
torch.manual_seed(jj)
torch.cuda.manual_seed_all(jj)
torch.cuda.manual_seed(jj)
# torch.backends.cudnn.benchmark = False
torch.backends.cudnn.deterministic = True
# torch.use_deterministic_algorithms(True)
torch.backends.cudnn.enabled = False
torch.backends.cudnn.benchmark = False
os.environ['PYTHONHASHSEED'] = str(jj)
os.environ['CUBLAS_WORKSPACE_CONFIG'] = ':4096:8'
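
Note that torch.use_deterministic_algorithms(True) (commented out above) is the switch that actually forces PyTorch to select deterministic kernels, and on CUDA 10.2+ it is the setting that requires CUBLAS_WORKSPACE_CONFIG to be set. A minimal sketch of what enabling it would look like (the warn_only variant is an optional assumption, not part of my current setup):

# Force deterministic algorithms; ops with no deterministic implementation raise an error
torch.use_deterministic_algorithms(True)
# or only warn instead of raising:
# torch.use_deterministic_algorithms(True, warn_only=True)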
Furthermore, here is the DataLoader setup:
from torch.utils.data import DataLoader

def seed_worker(worker_id):
    worker_seed = torch.initial_seed() % 2 ** 32
    np.random.seed(worker_seed)
    random.seed(worker_seed)

data_loader = DataLoader(train_sample, batch_size=20, num_workers=0, worker_init_fn=seed_worker)
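
For reference, the PyTorch reproducibility recipe also passes an explicitly seeded torch.Generator to the DataLoader so that any shuffling is reproducible as well. A minimal sketch of that variant (the shuffle and generator arguments are the additions; everything else matches my setup above):

# Seed a dedicated generator so that batch shuffling (if enabled) is reproducible too
g = torch.Generator()
g.manual_seed(jj)

data_loader = DataLoader(
    train_sample,
    batch_size=20,
    shuffle=True,  # only matters if shuffling is actually used
    num_workers=0,
    worker_init_fn=seed_worker,
    generator=g,
)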

When I set epochs=10, the results are identical no matter how many times I repeat the run. When I set epochs=200, the results are no longer deterministic.
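
To narrow down the first epoch at which two runs diverge, one option would be to print a per-epoch checksum of the model parameters and diff it between runs. A minimal sketch, where model and train_one_epoch are placeholders for my actual model and training loop:

def param_checksum(model):
    # Sum all parameters into one scalar; identical runs give identical values
    with torch.no_grad():
        return sum(p.detach().double().sum().item() for p in model.parameters())

for epoch in range(200):
    train_one_epoch(model, data_loader)  # placeholder for the real training step
    print(epoch, param_checksum(model))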