Initializing weights in nn.Sequential

You could define a function that initializes the parameters of each layer type, e.g.:

def weights_init(m):
    if isinstance(m, nn.Conv2d):
        torch.nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            torch.nn.init.zeros_(m.bias)

net.apply(weights_init)

Inside this function, you can add a condition for each layer type (e.g. `nn.Linear`, `nn.BatchNorm2d`) and apply the appropriate init method from `torch.nn.init`.
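A minimal runnable sketch (the model architecture here is just an example):

```python
import torch
import torch.nn as nn

def weights_init(m):
    # Xavier-initialize conv weights and zero their biases
    if isinstance(m, nn.Conv2d):
        torch.nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            torch.nn.init.zeros_(m.bias)
    # example of a second condition for another layer type
    elif isinstance(m, nn.Linear):
        torch.nn.init.kaiming_normal_(m.weight)
        if m.bias is not None:
            torch.nn.init.zeros_(m.bias)

net = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 30 * 30, 10),
)

# apply() calls weights_init recursively on every submodule
net.apply(weights_init)

print(net[0].bias.abs().sum().item())  # conv bias is now all zeros
```

`Module.apply` visits every submodule recursively, so the same function works for arbitrarily nested models, not just a flat `nn.Sequential`.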
