Translate weight initialization from TensorFlow to PyTorch

How would you go about initializing this TF fully connected layer

Dense(2, activation='linear', kernel_initializer=RandomNormal(mean=0.0, stddev=0.2, seed=seed),
      bias_initializer=Constant(value=[3., -3.]), name='pi')

in PyTorch?

I tried this:

self.pi = nn.Linear(input_dim, 2)
nn.init.normal_(self.pi.weight, mean=0, std=0.2)
nn.init.constant_(self.pi, 0)

But I am not sure about the last line…
Thank you for your time

Hi,

About the last line: you need to initialize the bias the same way you initialized your weights. In other words, pass self.pi.bias instead of self.pi, which is the module itself (the nn.init.* functions expect a tensor, not a module).

Also, nn.init.constant_(self.pi.bias, 0) would fill the entire bias tensor with zeros, whereas the TF counterpart initializes it with [3, -3]. To get those per-element values, you can simply assign a tensor containing them to the bias attribute of self.pi:

with torch.no_grad():
    # replace the default bias with the desired per-element values
    self.pi.bias = nn.Parameter(torch.tensor([3., -3.]))
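For reference, here is a minimal sketch of the full translation wrapped in a module (the PiHead name, input_dim value, and seed handling are illustrative assumptions; torch.manual_seed is only a rough stand-in for TF's per-initializer seed argument):

import torch
import torch.nn as nn

class PiHead(nn.Module):
    def __init__(self, input_dim, seed=0):
        super().__init__()
        torch.manual_seed(seed)  # rough stand-in for TF's per-initializer seed
        # Dense(2, activation='linear') corresponds to a plain nn.Linear with no activation
        self.pi = nn.Linear(input_dim, 2)
        nn.init.normal_(self.pi.weight, mean=0.0, std=0.2)  # RandomNormal(mean=0.0, stddev=0.2)
        with torch.no_grad():
            self.pi.bias.copy_(torch.tensor([3., -3.]))      # Constant(value=[3., -3.])

    def forward(self, x):
        return self.pi(x)

# usage: head = PiHead(input_dim=8); print(head.pi.bias)  ->  tensor([ 3., -3.], ...)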