Why does changing the order of weights_init calls give different weights?

Hi,

I’m trying to reproduce my results after refactoring my model’s code. I find that when I change the order of module initialization, e.g. changing

module1.apply(weights_init)
module2.apply(weights_init)

to

module2.apply(weights_init)
module1.apply(weights_init)

the weights of both modules change. If I run the same code again (without swapping), the weights are reproducible.
weights_init is a random initialization function that uses something like nn.init.kaiming_normal_().
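
For reference, it looks roughly like this (a sketch; the exact layers and init calls in my real code differ):

import torch.nn as nn

def weights_init(m):
    # called once for every submodule via module.apply(weights_init)
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        nn.init.kaiming_normal_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)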

But I’ve already fixed the random seeds like this:

torch.manual_seed(seed)
np.random.seed(seed)
random.seed(seed)
torch.cuda.manual_seed_all(seed)
os.environ['PYTHONHASHSEED'] = str(seed)

Is that normal? Is it caused by some non-deterministic CUDA ops?

Update:

import torch
import torch.nn as nn
import random
import numpy as np

def set_random_seed(random_seed):
    np.random.seed(random_seed)
    random.seed(random_seed)
    if torch.cuda.is_available():
        torch.cuda.manual_seed(random_seed)
        torch.cuda.manual_seed_all(random_seed)
    torch.manual_seed(random_seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_random_seed(0)

class Model(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.lin1 = nn.Linear(1, 16)
        self.lin2 = nn.Linear(16, 16)
        # swapping the above two lines gives different weights
        print(self.lin1.bias)

model = Model()
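
The same order dependence shows up with plain tensor draws, which makes me suspect both layers simply consume numbers from the same global RNG stream (my guess, not confirmed):

import torch

torch.manual_seed(0)
a = torch.randn(3)   # first draw from the seeded stream
b = torch.randn(3)   # second draw

torch.manual_seed(0)
b2 = torch.randn(3)  # drawn first this time
a2 = torch.randn(3)  # drawn second

print(torch.equal(a, b2))  # True: same position in the stream -> same values
print(torch.equal(a, a2))  # False: different position -> different values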

I usually set the seed in the following manner to reproduce results in PyTorch:

# Set the random seed

def set_random_seed(random_seed):
    np.random.seed(random_seed)
    random.seed(random_seed)
    if torch.cuda.is_available():
        torch.cuda.manual_seed(random_seed)
        torch.cuda.manual_seed_all(random_seed)
    torch.manual_seed(random_seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

Maybe try this?

Thanks for the fast reply. I’ve tried it, but the weights still change when I swap the order.

I’ve added a minimal example (see the update above) to show this issue.
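
If the cause really is the shared RNG stream, one workaround I could try is re-seeding right before each initialization, so that each module’s weights no longer depend on what was initialized before it (a sketch; the per-module seed offsets are arbitrary):

torch.manual_seed(seed)      # seed used only for module1
module1.apply(weights_init)
torch.manual_seed(seed + 1)  # hypothetical separate seed for module2
module2.apply(weights_init)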