Setting a random seed or passing a generator to Modules using sampling

I would like to obtain deterministic/reproducible weight initialization when working with, say, Conv1d, while limiting the deterministic state to the [default] weight initialization only and keeping the random state for the rest of the torch modules untouched. The following (non-functional) code expresses the idea I am trying to achieve:

import torch
import torch.nn as nn

gen1 = torch.Generator()
gen1.manual_seed(0)
conv1 = nn.Conv1d(16, 33, 3, stride=2)
nn.init.kaiming_uniform_(conv1.weight, a=0, mode='fan_in', nonlinearity='leaky_relu', generator=gen1)

gen2 = torch.Generator()
gen2.manual_seed(0)
conv2 = nn.Conv1d(16, 33, 3, stride=2)
nn.init.kaiming_uniform_(conv2.weight, a=0, mode='fan_in', nonlinearity='leaky_relu', generator=gen2)

assert torch.equal(conv1.weight, conv2.weight)

Expected: True

Unfortunately, kaiming_uniform_ does not accept a generator. Is there a way to do this?

This would make sure that the sampling from the distribution is the same.
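The same goal could also be expressed without a per-call generator by forking the global RNG state just for the initialization, so the seed does not leak into the rest of the program. A minimal sketch (the helper name `make_conv` is my own, for illustration):

```python
import torch
import torch.nn as nn

def make_conv(seed):
    # fork_rng restores the global RNG state on exit, so only the
    # construction/initialization inside this block is deterministic
    with torch.random.fork_rng():
        torch.manual_seed(seed)
        conv = nn.Conv1d(16, 33, 3, stride=2)
        nn.init.kaiming_uniform_(conv.weight, a=0, mode='fan_in',
                                 nonlinearity='leaky_relu')
    return conv

conv1 = make_conv(0)
conv2 = make_conv(0)
assert torch.equal(conv1.weight, conv2.weight)
```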

This sounds like a valid feature request, and I think a similar one was already created. In any case, I would recommend creating this feature request on GitHub as well so that it can be discussed with the code owners.

For now you could use the workaround of passing a generator directly to the tensor initialization methods, e.g.:

import torch

x = torch.empty(3, 3)
gen = torch.Generator()
gen.manual_seed(0)

# samples from gen's state, leaving the global RNG untouched
x.normal_(mean=1., std=2., generator=gen)

# samples from the global RNG; gen's state is not advanced
x.normal_(mean=1., std=2.)

# continues gen's sequence where the first call left off
x.normal_(mean=1., std=2., generator=gen)
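Applied to the Conv1d case above, this workaround can replicate what kaiming_uniform_ computes internally. A minimal sketch, with the fan_in/gain math written out by hand (assuming mode='fan_in', nonlinearity='leaky_relu', a=0; the helper name is hypothetical):

```python
import math
import torch
import torch.nn as nn

def kaiming_uniform_with_gen(weight, gen):
    # fan_in for a conv weight of shape (out_ch, in_ch, *kernel)
    # is in_ch * prod(kernel dims)
    fan_in = weight.size(1) * weight[0][0].numel()
    gain = math.sqrt(2.0)  # leaky_relu gain with negative slope a=0
    bound = math.sqrt(3.0) * gain / math.sqrt(fan_in)
    with torch.no_grad():
        # uniform_ accepts a generator, so the draw is reproducible
        return weight.uniform_(-bound, bound, generator=gen)

conv1 = nn.Conv1d(16, 33, 3, stride=2)
conv2 = nn.Conv1d(16, 33, 3, stride=2)

gen1 = torch.Generator(); gen1.manual_seed(0)
gen2 = torch.Generator(); gen2.manual_seed(0)

kaiming_uniform_with_gen(conv1.weight, gen1)
kaiming_uniform_with_gen(conv2.weight, gen2)

assert torch.equal(conv1.weight, conv2.weight)
```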

Thanks! I will set this up. I can also contribute directly there, although I am not familiar with torch development guidelines :stuck_out_tongue:

Any references?

Sounds great! After discussing the feature request in the GitHub issue, explain that you would be interested in working on it, and check out the Contribution guide and this doc. In case you are interested in a tutorial on how to write your first PR, check out @tom’s great video here.


Thanks! I will check these out :slight_smile: