I would like to obtain deterministic/reproducible weight initialization when working with, say, Conv1d, while limiting the deterministic state to the [default] weight initialization only and retaining the random state for the rest of the torch modules. The following non-functional code expresses the idea I am trying to achieve:
This sounds like a valid feature request, and I think a similar one was already created. In any case, I would recommend creating this feature request on GitHub as well so that it can be discussed with the code owners.
For now, you could use the same approach of passing a generator to initialize tensors directly as a workaround, e.g. via:
import torch

x = torch.empty(3, 3)
gen = torch.Generator()
gen.manual_seed(2809)
x.normal_(mean=1., std=2., generator=gen)  # deterministic draw from the seeded generator
x.normal_(mean=1., std=2.)                 # uses the global RNG, stays random
gen.manual_seed(2809)
x.normal_(mean=1., std=2., generator=gen)  # reproduces the first draw exactly
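As a sketch of how this workaround could extend to a full module such as Conv1d, one could re-initialize a layer's parameters in-place from a dedicated seeded generator, leaving the global RNG state untouched. The helper `seeded_init_` and its init scheme below are illustrative assumptions, not part of the PyTorch API:

```python
import torch
import torch.nn as nn

def seeded_init_(module, seed=2809):
    # Hypothetical helper: overwrite a module's parameters using a local,
    # seeded generator so the global RNG state is not consumed or changed.
    gen = torch.Generator()
    gen.manual_seed(seed)
    with torch.no_grad():
        for param in module.parameters():
            # Example scheme: normal init for weight tensors, zeros for biases
            if param.dim() > 1:
                param.normal_(mean=0., std=0.02, generator=gen)
            else:
                param.zero_()
    return module

conv1 = seeded_init_(nn.Conv1d(3, 8, kernel_size=3))
conv2 = seeded_init_(nn.Conv1d(3, 8, kernel_size=3))
print(torch.equal(conv1.weight, conv2.weight))  # True: both layers start identical
```

Any later `torch.randn`, dropout, or data shuffling still draws from the untouched global RNG, which is the separation asked about above.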
Sounds great! After discussing the feature request in the GitHub issue, mention that you would be interested in working on it, and check out the Contribution guide and this doc. In case you are interested in a tutorial on how to write your first PR, check out @tom’s great video here.