Cannot figure out how to resolve AssertionError

I’ve been trying to figure out how to resolve this error for a couple of hours with no luck. I’ve combed through the documentation but couldn’t find anything relevant. Can someone explain why the AssertionError is being thrown? Here’s the code:

import torch
import torch.nn as nn
import torchvision.transforms as transforms
from torch.autograd import Variable

num_epochs = 15
batch_size = 500
learning_rate = 0.003

class CNN(nn.Module):
    def __init__(self):
        super(CNN, self).__init__()
        self.layer1 = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5),
            nn.BatchNorm2d(16),
            nn.ReLU(),
            nn.MaxPool2d(2))

    def forward(self, x):
        out = self.layer1(x)
        return out

cnn = CNN()

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(cnn.parameters(), lr=learning_rate)

# img & labl are numpy ndarrays
img = torch.Tensor(img)
labl = torch.Tensor(labl)
image = Variable(img)
label = Variable(labl)

optimizer.zero_grad()
output = cnn(image)

Here’s the error message:

Traceback (most recent call last):
  File "conv_net.py", line 84, in <module>
    output = cnn(image)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/modules/module.py", line 202, in __call__
    result = self.forward(*input, **kwargs)
  File "conv_net.py", line 50, in forward
    out = self.layer1(x)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/modules/module.py", line 202, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/modules/container.py", line 64, in forward
    input = module(input)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/modules/module.py", line 202, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/modules/conv.py", line 237, in forward
    self.padding, self.dilation, self.groups)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/functional.py", line 38, in conv2d
    return f(input, weight, bias) if bias is not None else f(input, weight)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/_functions/conv.py", line 35, in forward
    output = self._update_output(input, weight, bias)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/_functions/conv.py", line 99, in _update_output
    output = self._thnn('update_output', input, weight, bias)
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/_functions/conv.py", line 159, in _thnn
    impl = _thnn_convs[self.thnn_class_name(input)]
  File "/home/randy/.virtualenv/ml-pytorch/local/lib/python2.7/site-packages/torch/nn/_functions/conv.py", line 140, in thnn_class_name
    assert input.dim() == 4 or input.dim() == 5
AssertionError

nn.Conv2d expects a batch dimension in its input, so you have to feed your network a tensor of shape (batch_size, nb_channels, height, width).

You can simply add a fake batch dimension with unsqueeze():

img = torch.Tensor(img).unsqueeze(0)
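
For reference, here is a minimal runnable sketch of the same fix with dummy data (the 28x28 size and the (1, H, W) layout of img are assumptions for illustration, not from the original post):

import torch
import torch.nn as nn

# Assumed shape for illustration: a single 1-channel 28x28 image stored as
# (channels, height, width) = (1, 28, 28).
img = torch.randn(1, 28, 28)

conv = nn.Conv2d(1, 16, kernel_size=5)

# Passing the 3-D img directly is what trips the assertion in the traceback:
# this Conv2d path only accepts 4-D (batched) or 5-D input.
batched = img.unsqueeze(0)   # (1, 28, 28) -> (1, 1, 28, 28), i.e. a batch of 1
out = conv(batched)
print(out.size())            # torch.Size([1, 16, 24, 24])

The printed size confirms the batch dimension flows through the convolution: 16 output channels and 28 - 5 + 1 = 24 on each spatial side.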