[SOLVED] TypeError: 'tuple' object is not callable

Does anyone know what the issue is, please?

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-130-4a1832287d27> in <module>()
      1 model, criterion, optimizer = build_model()
----> 2 train_model(config, model, criterion, optimizer)
      3 
      4 if config.config_type == 'Q1_4' or config.config_type == 'Q1_5':
      5     dropout_validates = []

<ipython-input-127-85332b119cd8> in train_model(config, model, criterion, optimizer)
     24             optimizer.zero_grad()
     25             # compute loss
---> 26             loss = criterion(model(inputs), targets)
     27             loss.backward()
     28             optimizer.step()

/anaconda/envs/py36/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    355             result = self._slow_forward(*input, **kwargs)
    356         else:
--> 357             result = self.forward(*input, **kwargs)
    358         for hook in self._forward_hooks.values():
    359             hook_result = hook(self, input, result)

<ipython-input-123-94533970db9c> in forward(self, x)
      9 
     10     def forward(self, x):
---> 11         x = nn.ReLU(self.fc1(x))
     12         x = nn.ReLU(self.fc2(x))
     13         x = nn.ReLU(self.dropout(x))

TypeError: 'tuple' object is not callable

My model is defined like this:

class MLPb(nn.Module):
    def __init__(self):
        super(MLPb, self).__init__()
        self.config = config
        self.fc1 = nn.Linear(784, 600),
        self.fc2 = nn.Linear(600, 200),
        self.dropout = nn.Dropout(p=0.5), #last layer dropout
        self.fc3 = nn.Linear(200, 10)
        
    def forward(self, x):
        x = nn.ReLU(self.fc1(x))
        x = nn.ReLU(self.fc2(x))
        x = nn.ReLU(self.dropout(x))
        x = nn.Softmax(self.fc3(x))
        return x
    

EDIT: the model is defined with commas between the layers.

You should initialize the nn.ReLU modules in your __init__ function and just call them in forward, or use the functional API: x = F.relu(x). Right now you are creating a new nn.ReLU module with self.fc1(x) as its constructor argument instead of applying the activation to it.

EDIT: Did removing the commas solve your problem? It still gives an error for me, for the reason mentioned above.
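For reference, here is a minimal corrected sketch combining both fixes (trailing commas removed, activations applied via the functional API). The self.config attribute from the original is left out, and since build_model() isn't shown, this assumes a criterion like nn.CrossEntropyLoss that expects raw logits, so the softmax is dropped from forward:

import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPb(nn.Module):
    def __init__(self):
        super(MLPb, self).__init__()
        # No trailing commas: a comma wraps each layer in a 1-tuple, and
        # calling a tuple raises "TypeError: 'tuple' object is not callable".
        self.fc1 = nn.Linear(784, 600)
        self.fc2 = nn.Linear(600, 200)
        self.dropout = nn.Dropout(p=0.5)  # last-layer dropout
        self.fc3 = nn.Linear(200, 10)

    def forward(self, x):
        # F.relu applies the activation; nn.ReLU(...) here would instead
        # construct a new module with self.fc1(x) as its constructor argument.
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.dropout(x)
        # Return raw logits for nn.CrossEntropyLoss (it applies log-softmax
        # internally); otherwise apply F.softmax(x, dim=1) here.
        return self.fc3(x)

model = MLPb()
print(model(torch.randn(4, 784)).shape)  # torch.Size([4, 10])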


Remove the commas between the layer assignments in __init__. Each trailing comma wraps the layer in a one-element tuple, which is exactly why calling self.fc1(x) raises TypeError: 'tuple' object is not callable.
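A quick way to see why the commas matter, as a standalone repro:

import torch
import torch.nn as nn

layer = nn.Linear(784, 600),   # trailing comma -> layer is a 1-tuple, not a module
print(type(layer))             # <class 'tuple'>
layer(torch.randn(1, 784))     # TypeError: 'tuple' object is not callable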
