How to use torch.nn.init.normal with nn.Linear

Hi, quick question: How do you use torch.nn.init.normal with nn.Linear?

Hi,

just call torch.nn.init.normal_ (the trailing underscore marks the in-place variant, which is the current spelling of the deprecated torch.nn.init.normal) on the parameters:

l = torch.nn.Linear(5, 10)
torch.nn.init.normal_(l.weight)
torch.nn.init.normal_(l.bias)

There are optional mean and std arguments for the mean and standard deviation.
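As a quick sketch (assuming a recent PyTorch where the in-place initializers carry the trailing underscore), you could pass those arguments like this:

```python
import torch

l = torch.nn.Linear(5, 10)
# draw the weights from N(0, 0.01^2); the call modifies l.weight in place
torch.nn.init.normal_(l.weight, mean=0.0, std=0.01)
# biases are often simply zeroed rather than drawn from a normal
torch.nn.init.zeros_(l.bias)
```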
If you want to initialize all parameters the same way, you could do

for p in l.parameters():
    torch.nn.init.normal_(p)

though it might not necessarily be a good idea.

Note that there are methods like xavier_normal_
and kaiming_normal_ that attempt to set the standard deviation based on the number of input and output features and, if you provide one, the gain of the activation function.
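For illustration (a sketch against the torch.nn.init API), the gain can be derived from the activation with calculate_gain, and kaiming_normal_ takes the nonlinearity directly:

```python
import torch

l = torch.nn.Linear(5, 10)

# Xavier/Glorot: variance scaled by fan_in and fan_out, times the gain
gain = torch.nn.init.calculate_gain('relu')  # sqrt(2) for ReLU
torch.nn.init.xavier_normal_(l.weight, gain=gain)

# Kaiming/He: variance scaled by fan_in (the default mode) for the given nonlinearity
torch.nn.init.kaiming_normal_(l.weight, nonlinearity='relu')
```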

Best regards

Thomas


When I try

class Feedforward(nn.Module):
    def __init__(self):
        super(Feedforward, self).__init__()
        prob_drop = 0.0
        self.fc1 = nn.init.normal(nn.Linear(784, 200))

I get

File "model/network.py", line 20, in __init__
    self.fc1 = torch.nn.init.normal(nn.Linear(784, 200))
AttributeError: 'module' object has no attribute 'init'

The initializer works on a parameter (a tensor), not on a module: create the nn.Linear first, then call the initializer on its weight and bias, rather than passing the module itself and assigning the result to self.fc1.
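A corrected version of the module above might look like this (the forward method and the std value are illustrative assumptions, not part of the original snippet):

```python
import torch
import torch.nn as nn

class Feedforward(nn.Module):
    def __init__(self):
        super().__init__()
        # create the module first, then initialize its parameters in place
        self.fc1 = nn.Linear(784, 200)
        nn.init.normal_(self.fc1.weight, std=0.01)
        nn.init.zeros_(self.fc1.bias)

    def forward(self, x):
        return torch.relu(self.fc1(x))
```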

Best regards

Thomas
