Random initialization of weights with torch.nn.init?

I need to write in PyTorch the equivalent of this NumPy weight and bias initialization:

import numpy as np

W1 = np.random.randn(n_x, n_h) * 0.01
b1 = np.zeros((1, n_h))

While torch.nn.init.zeros_ exists for the bias, I can't find a way to set random weights and multiply them by a constant, as in the NumPy version. The option doesn't seem to exist here: https://pytorch.org/docs/stable/nn.init.html

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):

    def __init__(self):
        super(Net, self).__init__()
        self.input_layer = nn.Linear(n_x, n_h)
        # torch.nn.init.uniform_(self.input_layer.weight)  # how to set this to random??
        torch.nn.init.zeros_(self.input_layer.bias)

        self.hidden_layer = nn.Linear(n_h, n_y)
        # torch.nn.init.uniform_(self.hidden_layer.weight)
        torch.nn.init.zeros_(self.hidden_layer.bias)

        self.output_layer = nn.Linear(n_y, n_classes)

    def forward(self, x):
        x = x.view(-1, n_x)  # flatten; the input size must match n_x
        x = F.relu(self.input_layer(x))
        x = F.relu(self.hidden_layer(x))
        x = self.output_layer(x)
        return F.log_softmax(x, dim=1)

Do you mean using a normal distribution? torch.nn.init.normal_ fills the tensor with random numbers drawn from a normal distribution with mean 0 and std 1 by default, or we can specify the mean and std ourselves, something like:

import torch
import torch.nn as nn
import seaborn as sns

x = nn.Linear(100, 100)
nn.init.normal_(x.weight, mean=0.0, std=1.0)
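
As a quick sanity check, the sampled weights should have a mean close to 0 and a std close to 1 (this just inspects the tensor created above):

print(x.weight.mean().item(), x.weight.std().item())  # roughly 0.0 and 1.0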

We can also plot the distribution of the weight matrix:

sns.distplot(x.weight.detach().numpy())

(image: histogram of the initialized weights)
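
Side note: distplot is deprecated in newer seaborn releases (0.11+). If it raises an error for you, a minimal equivalent sketch with histplot, flattening the 2D weight matrix first:

sns.histplot(x.weight.detach().numpy().ravel())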

Oh, I found the answer myself…
It can be done with normal_, adjusting the value of the standard deviation:
torch.nn.init.normal_(self.fc1.weight, mean=0.0, std=0.01)
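
Putting this together in the Net class above, the full equivalent of the original NumPy initialization would be something like the sketch below (note that nn.Linear stores its weight as (out_features, in_features), i.e. transposed relative to the NumPy version, which makes no difference for an i.i.d. normal init):

self.input_layer = nn.Linear(n_x, n_h)
torch.nn.init.normal_(self.input_layer.weight, mean=0.0, std=0.01)  # W1 = np.random.randn(n_x, n_h) * 0.01
torch.nn.init.zeros_(self.input_layer.bias)                         # b1 = np.zeros((1, n_h))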

I cannot see your answer :confused:

I put my answer back; I suggested the same thing. :slight_smile:
