"bias=False" is not working!

Hi, I’m implementing a CNN for MNIST.

I built my model like this:

class CNN(nn.Module):
    def __init__(self):
        super(CNN,self).__init__()
        self.layer = nn.Sequential(
            nn.Conv2d(1,16,5,bias=False),
            nn.ReLU(),
            nn.Conv2d(16,32,5,bias=False),
            nn.ReLU(),
            nn.MaxPool2d(2,2),
            nn.Conv2d(32,64,5,bias=False),
            nn.ReLU(),
            nn.MaxPool2d(2,2),
        )
        self.fc_layer = nn.Sequential(
            nn.Linear(64*3*3,100,bias=False),
            nn.ReLU(),
            nn.Linear(100,10,bias=False),
        )

    def forward(self,x):
        out = self.layer(x)
        out = out.view(out.size(0),-1)
        out = self.fc_layer(out)

        return out

BUT when I print the parameters of fc_layer, the bias parameters are still there…

for child in net.children():
    for param in child.fc_layer[0].parameters():
        print(param)

The result is:

Parameter containing:
-2.4335e-02 -2.6582e-02 -2.6456e-02 … 2.8038e-02 2.8356e-02 2.3965e-02
5.8043e-02 5.2095e-02 -1.4727e-02 … -1.2279e-02 4.1066e-02 -1.4745e-02
2.5018e-02 -3.0452e-02 -2.2104e-02 … 5.4361e-02 3.4657e-02 7.3185e-03
… ⋱ …
-1.9066e-02 -8.5296e-03 -3.2538e-02 … -5.0953e-03 2.4064e-02 -2.7636e-02
2.6700e-03 -1.1858e-02 6.5135e-03 … 1.6266e-02 2.5442e-02 3.6148e-03
-2.8540e-02 2.7211e-02 -2.3674e-02 … 4.1595e-03 -8.0679e-03 2.4592e-02
[torch.cuda.FloatTensor of size 100x576 (GPU 0)]

Parameter containing:
1.00000e-02 *
0.8742
-0.9784
1.9183
3.8764
.
.
.
-0.9190
-0.9720
2.3596
-3.0293
[torch.cuda.FloatTensor of size 100 (GPU 0)]

I DON’T want a bias. How can I remove it?
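
For anyone who instead wants to strip the bias from a model that was already built with `bias=True`, here is a possible sketch (not part of this thread's resolution; the toy model is made up for illustration). It relies on `nn.Module` allowing a registered parameter to be set back to `None`:

```python
import torch
from torch import nn

# Hypothetical toy model with biases, just for illustration
net = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2))

# Assigning None to a registered parameter removes it from the module's
# parameter list; F.linear and F.conv2d both accept bias=None, so the
# forward pass still works afterwards.
for m in net.modules():
    if isinstance(m, (nn.Linear, nn.Conv2d)):
        m.bias = None

# Only the weight matrices remain: 10*5 + 5*2 = 60 parameters
print(sum(p.numel() for p in net.parameters()))
```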

Hi,

I can’t reproduce this issue.
I suspect something is wrong in the way you create your network or in your test code. Indeed, with the CNN class you gave, no child contains fc_layer — it is an attribute of the network itself, so your loop over net.children() should raise an AttributeError.
What does the following code give you?

import torch
from torch import nn

class CNN(nn.Module):
    def __init__(self):
        super(CNN,self).__init__()
        self.layer = nn.Sequential(
            nn.Conv2d(1,16,5,bias=False),
            nn.ReLU(),
            nn.Conv2d(16,32,5,bias=False),
            nn.ReLU(),
            nn.MaxPool2d(2,2),
            nn.Conv2d(32,64,5,bias=False),
            nn.ReLU(),
            nn.MaxPool2d(2,2),
        )
        self.fc_layer = nn.Sequential(
            nn.Linear(64*3*3,100,bias=False),
            nn.ReLU(),
            nn.Linear(100,10,bias=False),
        )

    def forward(self,x):
        out = self.layer(x)
        out = out.view(out.size(0),-1)
        out = self.fc_layer(out)

        return out

net = CNN()
print(net.fc_layer[0].bias)
print(net.fc_layer[2].bias)
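
For reference, with bias=False the Linear module never creates the bias parameter at all — the bias attribute is simply None, so both prints above should show None. A minimal self-contained check:

```python
from torch import nn

# A Linear layer matching the first fc_layer entry from the thread
lin = nn.Linear(64 * 3 * 3, 100, bias=False)
print(lin.bias)  # prints: None
```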

I ran the same code in another directory and it works now… haha… I must have been running an old copy of the file.