Why does the input image shape change after training a neural network?

Hi, I am training on Fashion-MNIST using a neural network. When loading the dataset, I specify batch size = 64, so "images" is a tensor with size (64, 1, 28, 28). However, after training the network, the image shape turns out to be (32, 1, 28, 28). I am a newbie to PyTorch, so I do not really understand why this happened. Could you please explain? Thanks in advance!

Below is my code:

import torch
from torch import nn, optim
import torch.nn.functional as F
from torchvision import datasets, transforms
import helper

# Define a transform to normalize the data
transform = transforms.Compose([transforms.ToTensor(),
                                transforms.Normalize((0.5,), (0.5,))])
# Download and load the training data
trainset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/', download=True, train=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True)

# Download and load the test data
testset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/', download=True, train=False, transform=transform)
testloader = torch.utils.data.DataLoader(testset, batch_size=64, shuffle=True)

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 128)
        self.fc3 = nn.Linear(128, 64)
        self.fc4 = nn.Linear(64, 10)
        
    def forward(self, x):
        # make sure input tensor is flattened
        x = x.view(x.shape[0], -1)
        
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = F.relu(self.fc3(x))
        x = F.log_softmax(self.fc4(x), dim=1)
        
        return x

model = Classifier()
criterion = nn.NLLLoss()
optimizer = optim.Adam(model.parameters(), lr=0.003)

epochs = 5

for e in range(epochs):
    running_loss = 0
    for images, labels in trainloader:
        log_ps = model(images)
        loss = criterion(log_ps, labels)
        
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        
        running_loss += loss.item()
    else:
        print(f"Training loss: {running_loss/len(trainloader)}")
        print(images.shape)

Then I got the final results as follows:

Training loss: 0.3105689758049654
torch.Size([32, 1, 28, 28])
Training loss: 0.300544722255931
torch.Size([32, 1, 28, 28])
Training loss: 0.28751245076690657
torch.Size([32, 1, 28, 28])
Training loss: 0.2766977702217824
torch.Size([32, 1, 28, 28])
Training loss: 0.2701437816992879
torch.Size([32, 1, 28, 28])

This is happening because the Fashion-MNIST training set has 60,000 images, which is not evenly divisible by your batch size of 64: 60,000 = 937 × 64 + 32, so the last batch only contains the remaining 32 images. This does not affect your training; the else statement is just printing out that last batch.

Why do you use the else: statement? Instead, you could do something like this:

for e in range(epochs):
    running_loss = 0
    for images, labels in trainloader:
        log_ps = model(images)
        loss = criterion(log_ps, labels)
        
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        
        running_loss += loss.item()
    
    print(f"Training loss: {running_loss/len(trainloader)}")
    print(images.shape)

Here there is no else statement, and the print calls are in the epoch loop rather than in the dataloader loop. It will still print 32 as the size of the last batch, but as I said above, this is not a problem.
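If you would rather have every batch contain exactly 64 images, you can also pass drop_last=True to the DataLoader, which discards the incomplete final batch. A minimal sketch, reusing the trainset from your code:

# Drop the incomplete final batch so every batch has exactly 64 images.
# Note: this skips the leftover 32 training images in each epoch.
trainloader = torch.utils.data.DataLoader(trainset, batch_size=64,
                                          shuffle=True, drop_last=True)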


Thank you for your reply.

I used the else statement because I want to print the training loss after the dataloader loop has run over all the image batches in one epoch. I find the results for both cases (with and without the else statement) are similar. Could you explain why we should not use the else statement here? Is it just to make the code shorter, or does it cause something to go wrong? Thank you!

No, there isn't really a problem with using the else statement. A for loop's else block simply runs when the loop finishes without hitting a break; since your loop has no break, it behaves exactly like code placed after the loop. It just looks confusing to readers, because an else with no if (and no break to distinguish it from) adds nothing. Putting the print statements directly in the epoch loop, like I did, makes the code look cleaner and easier to follow.
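For reference, here is a minimal, PyTorch-free sketch of Python's for/else behaviour: the else block runs only when the loop completes without a break, which is why it acts the same as code placed after the loop in your case.

for i in range(3):
    print(i)
else:
    # Runs, because the loop finished without a break.
    print("loop completed")

for i in range(3):
    if i == 1:
        break
else:
    # Skipped, because the loop exited via break.
    print("never printed")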
