Does nn.Sigmoid() have a bias parameter?

Hello,
The question is simple: does nn.Sigmoid() have a bias parameter?
If it doesn't, how can I build one myself?

Best regards,
Alex.

It’s directly based on torch.sigmoid, so no, it does not have a bias parameter.
According to http://pytorch.org/docs/master/notes/extending.html,
what you can do is:

import torch
import torch.nn as nn


class SigmoidBias(nn.Module):
    def __init__(self, output_features, bias=True):
        super(SigmoidBias, self).__init__()
        if bias:
            self.bias = nn.Parameter(torch.Tensor(output_features))
        else:
            # You should always register all possible parameters, but the
            # optional ones can be None if you want.
            self.register_parameter('bias', None)
        if self.bias is not None:
            self.bias.data.uniform_(-0.1, 0.1)

    def forward(self, input):
        output = torch.sigmoid(input)
        if self.bias is not None:
            output += self.bias.unsqueeze(0).expand_as(output)
        return output
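
A quick forward-only usage sketch (the feature size and input shape below are made up for illustration):

layer = SigmoidBias(output_features=4)
x = torch.randn(2, 4)
out = layer(x)    # sigmoid(x) plus a learnable per-feature bias
print(out.shape)  # torch.Size([2, 4])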

Thank you 🙂


Hello @alexis-jacq,
I have just tried to implement your module, but it doesn’t work properly.
I get an error:

So, I removed your SigmoidBias module:

class SigmoidBias(nn.Module):
    def __init__(self, output_features, bias=True, inplace=False):
        super(SigmoidBias, self).__init__()
        self.inplace = inplace
        if bias:
            self.bias = nn.Parameter(torch.Tensor(output_features))
        else:
            self.register_parameter('bias', None)
        if self.bias is not None:
            self.bias.data.uniform_(-0.1, 0.1)

    def forward(self, input):
        output = torch.sigmoid(input)
        if self.bias is not None:
            output += self.bias.unsqueeze(0).expand_as(output)
        return output

and it worked fine.
Could you help me solve the problem?

That’s interesting, it’s because of the “+=” operation in the forward method.
Apparently, x += y is not equivalent to x = x + y: it calls a function that performs an in-place operation (on x.data), and autograd complains because the output of sigmoid gets modified in place even though it is needed to compute the gradient.
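
A minimal sketch of the difference, assuming a recent PyTorch build (tensor names and sizes are just for illustration); note the failure only shows up when backward() is called:

import torch

x = torch.randn(5, requires_grad=True)
bias = torch.randn(5, requires_grad=True)

# Out-of-place: a new tensor is created, the output saved by sigmoid is untouched.
out = torch.sigmoid(x)
out = out + bias
out.sum().backward()  # works

# In-place: the tensor that sigmoid saved for its backward pass gets overwritten.
out = torch.sigmoid(x)
out += bias
out.sum().backward()  # RuntimeError: one of the variables needed for gradient
                      # computation has been modified by an inplace operation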

This code worked for me:

class SigmoidBias(nn.Module):
    def __init__(self, output_features=5, bias=True):
        super(SigmoidBias, self).__init__()
        if bias:
            # initialize the bias uniformly in [-0.1, 0.1]
            uniform = 0.1 * (1 - 2 * torch.rand(output_features))
            self.bias = nn.Parameter(uniform)
        else:
            self.register_parameter('bias', None)

    def forward(self, input):
        output = torch.sigmoid(input)
        if self.bias is not None:
            output = output + self.bias.unsqueeze(0).expand_as(output)
        return output
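
As a quick check (the sizes here are just an example), the bias now behaves like any other parameter and receives a gradient:

layer = SigmoidBias(output_features=5)
x = torch.randn(3, 5)

out = layer(x)          # sigmoid(x) + bias, broadcast over the batch
out.sum().backward()
print(layer.bias.grad)  # gradient flows into the bias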

Brilliant, it works!
Thanks a lot.