Standardized way to initialize layers

Hi,
I was wondering if there was a standard way to initialize various layers in PyTorch.
For example, I have used something like this for linear and conv layers (please correct me if this is wrong):

        import torch.nn as nn
        import torch.nn.init as init

        def _weights_init(m):
            if isinstance(m, (nn.Conv2d, nn.Linear)):
                init.xavier_normal_(m.weight)
                if m.bias is not None:  # bias may be disabled (bias=False)
                    init.zeros_(m.bias)
            elif isinstance(m, (nn.BatchNorm2d, nn.BatchNorm1d)):
                init.ones_(m.weight)
                init.zeros_(m.bias)
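
I then apply it recursively to the whole model with Module.apply (here `model` stands in for any nn.Module):

        model.apply(_weights_init)  # calls _weights_init on every submodule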

Similarly, what is the best way to initialize recurrent layers (such as LSTMs and GRUs)?

Thanks!

Your use of torch.nn.init seems to be the standard way. For initializing the weights of LSTMs/GRUs, use the init module on the weight/bias matrices.
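
As a minimal sketch, you can loop over named_parameters and dispatch on the parameter names (these follow PyTorch's nn.LSTM convention: weight_ih_l*, weight_hh_l*, bias_ih_l*, bias_hh_l*; the sizes and the choice of Xavier for input-to-hidden and orthogonal for the recurrent weights are just one common heuristic, not something the framework prescribes):

        import torch.nn as nn
        import torch.nn.init as init

        lstm = nn.LSTM(input_size=128, hidden_size=256, num_layers=2)  # sizes are arbitrary
        for name, param in lstm.named_parameters():
            if 'weight_ih' in name:    # input-to-hidden weights
                init.xavier_normal_(param)
            elif 'weight_hh' in name:  # hidden-to-hidden (recurrent) weights
                init.orthogonal_(param)
            elif 'bias' in name:       # covers both bias_ih and bias_hh
                init.zeros_(param)

The same loop works for nn.GRU, since it uses the same parameter naming scheme.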