Optimizer: TypeError: __init__() got an unexpected keyword argument 'momentum'

Hello,
I am trying to build a common training setup for both MNIST and CIFAR10; the optimizer is SGD.
However, I found that the setup below only works for CIFAR10:

optimizer(self.model.parameters(), lr=cmd_lr, momentum=0.9, weight_decay=1e-4)

With MNIST, it seems I cannot set the momentum (removing it works fine); otherwise I get an error like:

TypeError: __init__() got an unexpected keyword argument 'momentum'

Why does that happen? I cannot figure out why SGD for MNIST would not support the momentum setting.

The MNIST model is shown below; it is a fairly common architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F


class Net(nn.Module):
    def __init__(self) -> None:
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, 3, 1)  # 1 input channel: grayscale MNIST images
        self.conv2 = nn.Conv2d(32, 64, 3, 1)
        self.dropout1 = nn.Dropout(0.25)
        self.dropout2 = nn.Dropout(0.5)
        self.fc1 = nn.Linear(9216, 128)  # 64 channels * 12 * 12 after pooling
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x) -> torch.Tensor:
        x = self.conv1(x)
        x = F.relu(x)
        x = self.conv2(x)
        x = F.relu(x)
        x = F.max_pool2d(x, 2)
        x = self.dropout1(x)
        x = torch.flatten(x, 1)
        x = self.fc1(x)
        x = F.relu(x)
        x = self.dropout2(x)
        x = self.fc2(x)
        output = F.log_softmax(x, dim=1)
        return output

The initialization of the optimizer should not depend on which dataset you are using.

# Example
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

Could you post the line of code where you get this error, or a minimal reproducible example?

Sorry, after taking a closer look I realized I was actually passing a different optimizer to MNIST (Adadelta) than to CIFAR10 (SGD). It looks like Adadelta does not support the momentum argument…
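
For anyone hitting the same thing, here is a minimal sketch that reproduces the mismatch (the nn.Linear model is just a placeholder):

import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # placeholder model

# SGD implements momentum, so this works:
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=1e-4)

# Adadelta has no momentum argument, so this raises the TypeError:
try:
    optimizer = torch.optim.Adadelta(model.parameters(), lr=1.0, momentum=0.9)
except TypeError as e:
    print(e)  # __init__() got an unexpected keyword argument 'momentum'

If you want to keep a single code path for both datasets, one option is to drop any keyword the chosen optimizer class does not accept; make_optimizer below is a hypothetical helper, not part of PyTorch:

import inspect

def make_optimizer(opt_cls, params, **kwargs):
    # Keep only the kwargs that the optimizer's __init__ actually accepts,
    # e.g. momentum is dropped for Adadelta but kept for SGD.
    accepted = inspect.signature(opt_cls).parameters
    kwargs = {k: v for k, v in kwargs.items() if k in accepted}
    return opt_cls(params, **kwargs)

optimizer = make_optimizer(torch.optim.Adadelta, model.parameters(),
                           lr=1.0, momentum=0.9, weight_decay=1e-4)

Note that silently dropping arguments can hide typos, so logging the dropped keys might be safer in practice.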

Just found a good article that explains what momentum is and why SGD has it while some other optimizers do not.
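
For reference, PyTorch's SGD momentum update (assuming dampening=0 and no weight decay) is roughly the following; a minimal sketch, not the library's actual implementation:

import torch

def sgd_momentum_step(param, grad, velocity, lr=0.1, momentum=0.9):
    # PyTorch convention: v <- momentum * v + grad, then p <- p - lr * v
    velocity.mul_(momentum).add_(grad)
    param.sub_(lr * velocity)
    return velocity

# The velocity buffer is carried across steps, which is what gives momentum
# its "memory" of past gradients.
p, g, v = torch.zeros(3), torch.ones(3), torch.zeros(3)
for _ in range(3):
    v = sgd_momentum_step(p, g, v)
print(p)  # tensor([-0.5610, -0.5610, -0.5610])

Adadelta, by contrast, adapts its step sizes from running averages of squared gradients and squared updates, so it simply has no momentum hyperparameter to set.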