Good way of initializing weights

I want to initialize a network which contains linear layers, convolutional layers, pooling and so on.
My current method of initialization looks like:

    for p in model.parameters():
        if p.ndim > 1:
            torch.nn.init.xavier_uniform_(p)

Is this a good way of initializing the weights or is there a better general method?


Your method may work for some networks (though it silently skips 1-d parameters such as biases, which you may also want to handle), but it is more common to delegate initialization to the submodules themselves for finer control: the other nn.init initializers all have their uses, depending on the distribution of a submodule's inputs or on the activation function that precedes it.
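As a sketch of what per-submodule initialization can look like (the specific initializer choices below are illustrative assumptions, not a recommendation for your network), you can dispatch on the module type inside a function and hand it to model.apply, which visits every submodule recursively:

```python
import torch
import torch.nn as nn

def init_weights(m):
    # Sketch: pick an initializer per submodule type.
    # The gain/nonlinearity choices here are assumptions for illustration.
    if isinstance(m, nn.Conv2d):
        # Kaiming init is a common choice for layers followed by ReLU.
        nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
        if m.bias is not None:
            nn.init.zeros_(m.bias)
    elif isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)

model = nn.Sequential(
    nn.Conv2d(3, 16, 3), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 30 * 30, 10),
)
model.apply(init_weights)  # applies init_weights to every submodule
```

The isinstance checks are what give you the per-layer control mentioned above: each module type (and, if you like, each named submodule) can get its own initializer.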

I should have mentioned the obvious: torch.nn's built-in modules already initialize their own parameters (with initializers hard-coded in their reset_parameters() methods), so your code is only useful if you are unhappy with those defaults, or if you create parameters manually; but in that case you would probably want to fine-tune the initialization locally anyway.
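For manually created parameters, a common pattern is to follow the same reset_parameters() convention the built-in modules use. Here is a minimal sketch with a hypothetical custom module (the name and initializer choices are assumptions for illustration):

```python
import math
import torch
import torch.nn as nn

class ScaledLinear(nn.Module):
    """Hypothetical module with manually created parameters: the built-in
    initialization does not cover these, so we mirror the torch.nn
    convention of a reset_parameters() method called from __init__."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.scale = nn.Parameter(torch.empty(out_features))
        self.reset_parameters()

    def reset_parameters(self):
        # Same default nn.Linear uses for its weight.
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))
        nn.init.ones_(self.scale)

    def forward(self, x):
        return nn.functional.linear(x, self.weight) * self.scale

layer = ScaledLinear(8, 4)
out = layer(torch.randn(2, 8))
```

Keeping the initialization in reset_parameters() means it runs once at construction and can also be re-invoked later to re-initialize the module in place.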