Standardized way to initialize layers

I was wondering if there was a standard way to initialize various layers in PyTorch.
For example, I have used something like this for linear and conv layers (please correct me if this is incorrect):

        def _weights_init(m):
            if isinstance(m, (nn.Conv2d, nn.Linear)):
                nn.init.kaiming_normal_(m.weight)   # He/Kaiming init for conv and linear weights
            elif isinstance(m, (nn.BatchNorm2d, nn.BatchNorm1d)):
                nn.init.ones_(m.weight)             # start BatchNorm as an identity-like transform
                nn.init.zeros_(m.bias)

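For reference, a helper like this is usually attached to the whole network with Module.apply (the model below is just an illustrative example, not my actual architecture):

        model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16))
        model.apply(_weights_init)   # recursively calls _weights_init on every submodule
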
Similarly, what is the best way to initialize recurrent layers (such as LSTMs and GRUs)?


Your use of torch.nn.init seems to be the standard way. For initializing the weights of LSTMs/GRUs, apply the init module's functions to their weight/bias matrices.
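
For example, one common pattern (just a sketch, the particular choice of xavier_uniform_/orthogonal_ here is illustrative rather than a fixed convention) is to loop over the recurrent module's named parameters and pick an init function based on the parameter name:

        import torch.nn as nn

        def _init_rnn(rnn):
            # Works for nn.LSTM, nn.GRU, nn.RNN: parameters are named
            # weight_ih_l{k}, weight_hh_l{k}, bias_ih_l{k}, bias_hh_l{k}
            for name, param in rnn.named_parameters():
                if "weight_ih" in name:
                    nn.init.xavier_uniform_(param)   # input-to-hidden weights
                elif "weight_hh" in name:
                    nn.init.orthogonal_(param)       # hidden-to-hidden weights
                elif "bias" in name:
                    nn.init.zeros_(param)            # biases

        lstm = nn.LSTM(input_size=32, hidden_size=64, num_layers=2)
        _init_rnn(lstm)

Which scheme works best is architecture-dependent; the point is the mechanism of selecting parameters by name and applying torch.nn.init functions in place.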