AttributeError: 'NoneType' object has no attribute 'log_softmax'

I'm getting this error from criterion: it seems the input passed to it is None, which is strange, since the same code works for the fully connected model.

Here is the full project: https://github.com/kanedaaaa/Pytorch-MNIST-fully-connected-nns

You can find cnn.py in the Models folder and the data/train/test setup in main.py.

Code snippets:

import torch
import torch.nn as nn

class CnnNet(nn.Module):
  def __init__(self):
    super(CnnNet, self).__init__()
    self.conv1 = nn.Conv2d(in_channels=1, out_channels=32, kernel_size=3, padding=1)
    self.conv2 = nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, padding=1)
    self.pool = nn.MaxPool2d(2, 2)
    self.fc1 = nn.Linear(64*7*7, 128)  # 28x28 input is pooled twice, so feature maps are 7x7
    self.fc2 = nn.Linear(128, 10)
    self.dropout = nn.Dropout(p=0.5)
    self.relu = nn.ReLU()

  def forward(self, x):
    x = self.conv1(x)
    x = self.relu(x)
    x = self.pool(x)  # -> (N, 32, 14, 14)
    x = self.conv2(x)
    x = self.relu(x)
    x = self.pool(x)  # -> (N, 64, 7, 7)
    x = x.reshape(x.size(0), -1)
    x = self.fc1(x)
    x = self.dropout(x)

    prediction = self.fc2(x)

def train():
  for epoch in range(num_epochs):
    running_loss = 0
    for i, (inputs, labels) in enumerate(trainloader):
      # inputs, labels = inputs.reshape(-1, 28*28).to(device), labels.to(device)
      inputs, labels = inputs.to(device), labels.to(device)

      optimizer.zero_grad()

      output = net(inputs)
      loss = criterion(output, labels)

      loss.backward()
      optimizer.step()

      running_loss += loss.item() * inputs.size(0)

      if (i+1) % 100 == 0:
        print('Epoch [{}/{}], Step [{}/{}], Loss: {:.4f}'.format(epoch+1, num_epochs, i+1, total_step, loss.item()))

    loss_values.append(running_loss / total_step)

By the way, the fully connected model needs the inputs reshaped, hence the commented-out line in the training loop.
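A quick sketch of the difference (hypothetical batch, the usual MNIST shapes):

import torch

# hypothetical batch, same shape a MNIST trainloader batch would have
inputs = torch.randn(64, 1, 28, 28)

# fully connected net: flatten each image to a 784-dim vector
fc_inputs = inputs.reshape(-1, 28*28)  # (64, 784)

# CNN: keep the 4-D batch, Conv2d expects (N, C, H, W)
cnn_inputs = inputs  # (64, 1, 28, 28)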


Hi, you forgot to return prediction. Your forward returns nothing, so net(inputs) is None, and criterion then fails when it tries to apply log_softmax to it, which is exactly the error you see.
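Something like this at the end of forward should fix it (just a sketch reusing your names):

    x = self.fc1(x)
    x = self.dropout(x)
    prediction = self.fc2(x)
    return prediction  # without this, forward implicitly returns None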

Best


Moral of the story: don't write code at 5 am.
