Initializing the bias of a convolutional layer

I want to implement a residual network, and I have seen that these work best if you start with an initial negative bias for the skip connections (for example b = -1, -3, …). My skip connections are 1x1 convolutions (since I need them for resizing), and I want to initialize their biases with a negative value, for example:

self.skip_connection = nn.Conv2d(in_channels=3 , out_channels=16, kernel_size=1, stride=2, padding=0, BIAS_INITIALIZER= -3)

This does not work, since the BIAS_INITIALIZER argument is taken from TensorFlow. How can I do that here in PyTorch?

You can set it directly on the layer's bias tensor:

BIAS_INIT = -3
self.skip_connection.bias.data.fill_(BIAS_INIT)
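As a standalone sketch (layer shapes taken from the question; wrapping the in-place write in `torch.no_grad()` keeps autograd from tracking it):

```python
import torch
import torch.nn as nn

BIAS_INIT = -3.0

skip_connection = nn.Conv2d(in_channels=3, out_channels=16,
                            kernel_size=1, stride=2, padding=0)
# fill the bias tensor in place with the negative initial value
with torch.no_grad():
    skip_connection.bias.fill_(BIAS_INIT)

print(skip_connection.bias)  # all 16 entries are -3
```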

@vabh Thank you for your answer, can I place this in the init method of the network?

Yes, you can do this there.
You can probably do something like:

for m in self.modules():
    # note: Conv2d stores kernel_size as a tuple, so compare against (1, 1)
    if isinstance(m, nn.Conv2d) and m.kernel_size == (1, 1):
        # bias init code

For general weight initialisation methods, have a look at: http://pytorch.org/docs/master/nn.html#torch-nn-init
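Putting the pieces together, a minimal sketch of a module whose 1x1 skip-connection biases are initialized to a negative value inside `__init__` (the module and its layer sizes are illustrative, not from the original post):

```python
import torch
import torch.nn as nn

BIAS_INIT = -3.0  # illustrative negative initial bias

class Block(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1)
        self.skip_connection = nn.Conv2d(3, 16, kernel_size=1, stride=2, padding=0)
        # initialize the bias of every 1x1 convolution to a negative value;
        # Conv2d.kernel_size is a tuple, so compare against (1, 1)
        for m in self.modules():
            if isinstance(m, nn.Conv2d) and m.kernel_size == (1, 1):
                nn.init.constant_(m.bias, BIAS_INIT)

    def forward(self, x):
        return self.conv(x) + self.skip_connection(x)

block = Block()
print(block.skip_connection.bias)  # all entries are -3
```

`nn.init.constant_` is one of the initialisation helpers from the `torch.nn.init` page linked above and does the same in-place fill as `fill_`.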
