ValueError: too many values to unpack (expected 2) with MNIST

I’m using the MNIST dataset in my code, but there’s an error when I run it:

Traceback (most recent call last):
  File "C:\Users\Ati\Desktop\GD\", line 306, in <module>
    update_interval=update_interval, max_prob=max_prob, plot=plot, train=train, gpu=gpu)
  File "C:\Users\Ati\Desktop\GD\", line 105, in main
    images, labels = next(dataiter)
ValueError: too many values to unpack (expected 2)

I think the problem is related to this part of my code:

    transform = transforms.Compose([transforms.ToTensor(),
                                    transforms.Normalize((0.5,), (0.5,))])
    root = data_path
    dataset = MNIST(root=root, download=True, transform=transform)

    trainloader = DataLoader(dataset, batch_size=1, shuffle=True)
    dataiter = iter(trainloader)
    images, labels = next(dataiter)
    images, labels = images.view(-1, 784), labels

Could someone give me a solution to this problem, please?


I tried your code and did not get any errors. Is there another place where you call images, labels = next(dataiter)? Also, what are your PyTorch and torchvision versions?

And one last thing, why do you divide your images by 255 on the last line?
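If you are not sure which versions you have, here is a quick way to check (a small sketch using the standard-library importlib.metadata, so it runs whether or not a package is installed):

```python
# Print the installed version of each package, or "not installed".
from importlib.metadata import version, PackageNotFoundError

def report_version(pkg: str) -> str:
    """Return the installed version of pkg, or 'not installed'."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

for pkg in ("torch", "torchvision"):
    print(pkg, report_version(pkg))
```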

Thanks @beaupreda for your response.
I use next(dataiter) in another part of my code:

    for i in range(60000):
        label = torch.Tensor([labels[i % len(labels)]]).long()
        image = images[i % len(labels)]

By the way, when I write print(images) it just shows me two arrays of 28*28 (that is, only two images, while we have 60000 training examples in MNIST).
Do you think I need to put it in a loop, or is that not needed?
About the last line and dividing by 255: that was my fault, I had modified it.
The original code is here, if you’d like to check it.

What next(dataiter) does is give you a B x 1 x H x W image tensor and a B label tensor, where B is the batch size specified in your DataLoader. If you want to see every training example, an easier way is to simply do a for loop over the trainloader directly.

for images, labels in trainloader:
    # training loop, reshape images if you have to
    # torch.Size([1, 1, 28, 28])
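
To make that loop concrete, here is a self-contained sketch; it substitutes a small random TensorDataset for MNIST (an assumption, so it runs without downloading anything), but the iteration pattern is exactly the same:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for MNIST: 8 random 1x28x28 "images" with integer labels 0-9.
fake_images = torch.randn(8, 1, 28, 28)
fake_labels = torch.randint(0, 10, (8,))
dataset = TensorDataset(fake_images, fake_labels)

trainloader = DataLoader(dataset, batch_size=1, shuffle=True)

for images, labels in trainloader:
    # images: torch.Size([1, 1, 28, 28]), labels: torch.Size([1])
    flat = images.view(-1, 784)  # flatten to (batch, 784) if your model needs it
```

Each iteration yields one batch, so with batch_size=1 the loop visits every example in the dataset one at a time.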

Hope this helps!
