MNIST Dataloader Problem

Hi there :slight_smile:

For some time, there have been issues with downloading the MNIST dataset via torchvision. I found a solution to this from

After running that code and making some adjustments to the folders in Google Colab, I tried to load the data into a dataloader by passing its path and parameters as explained in
PyTorch beginner: CNN for MNIST | Kaggle

The code I am working on is below:

def get_train_loader(dataset):
    """Get train dataloader of source domain or target domain
    :return: dataloader
    """
    if dataset == 'MNIST':
        transform = transforms.Compose([
            transforms.ToTensor(),  # convert the PIL image to a 1xHxW tensor
            transforms.Lambda(lambda x: x.repeat(3, 1, 1)),  # grayscale -> 3 channels
            transforms.Normalize(mean=dataset_mean, std=dataset_std)
        ])
        data = datasets.MNIST(root=mnist_path + '/processed', train=True, transform=transform, download=False)  # This line gives the problem
        dataloader = DataLoader(dataset=data, batch_size=batch_size, shuffle=True, drop_last=True)
        return dataloader

mnist_path is a string variable that holds the path to the root of the MNIST folder.
However, when I run this, I get a runtime error:
RuntimeError: Dataset not found. You can use download=True to download it.

But this does not help, due to the issue I explained at the beginning. I am working on Google Colab with MNIST and MNIST-M. I have uploaded MNIST-M to Google Colab and unzipped it. MNIST is also there; it has two subfolders, processed and raw. The processed folder also contains and Can you help me solve this problem? Thank you so much :))

The folder structure sounds correct, so are you sure the path is right?
Also, what kind of issue were you seeing when downloading the MNIST dataset?
In case you were hitting a server error, could you try updating torchvision to the nightly build and rerunning, as another download mirror was added?
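One thing worth checking: recent torchvision releases build the dataset paths as root/MNIST/processed and root/MNIST/raw, so root usually has to point at the directory *containing* the MNIST folder, not at MNIST/ or processed/ itself (passing mnist_path + '/processed' as root would then trigger exactly the "Dataset not found" error). A minimal stdlib sketch to sanity-check the layout in Colab; locate_mnist_root is a hypothetical helper name, and mnist_path is the variable from the original post:

```python
import os

def locate_mnist_root(path):
    """Best-effort guess at the `root` argument torchvision expects.

    torchvision lays its files out as <root>/MNIST/processed and
    <root>/MNIST/raw, so `root` must be the directory that *contains*
    the MNIST folder. Returns the root to pass, or None if no
    recognizable MNIST layout is found under `path`.
    """
    if os.path.isdir(os.path.join(path, "MNIST", "processed")):
        return path  # `path` already is a valid root
    if os.path.isdir(os.path.join(path, "processed")):
        # `path` is the MNIST folder itself; its parent is the root
        return os.path.dirname(os.path.abspath(path.rstrip("/")))
    return None  # no recognizable MNIST layout under `path`
```

So if mnist_path is the folder holding processed/ and raw/, you would pass its parent directory as root and let torchvision append MNIST/processed itself.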

Unsure if you’ve solved this or not, but I ran into similar file I/O issues in Colab. The cleanest way I found for loading MNIST is to simply load it via Keras and then package it into a DataLoader. See the example here. (Caveat: Colab vomits when running the model, and as of this writing I still don’t know why.)
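For anyone landing here later, a minimal sketch of that Keras-to-DataLoader approach, assuming torch, numpy, and tensorflow are installed (as on Colab); arrays_to_loader is a hypothetical helper name, not an API from either library:

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset

def arrays_to_loader(images, labels, batch_size=64, shuffle=True):
    """Wrap numpy image/label arrays in a PyTorch DataLoader.

    images: uint8 array of shape (N, H, W); labels: int array of shape (N,).
    Pixels are scaled to [0, 1] and a channel dimension is added, giving
    the (N, 1, H, W) layout a simple MNIST CNN expects.
    """
    x = torch.from_numpy(images).float().unsqueeze(1) / 255.0  # (N, 1, H, W)
    y = torch.from_numpy(labels).long()
    return DataLoader(TensorDataset(x, y), batch_size=batch_size, shuffle=shuffle)

# Usage with Keras (downloads MNIST without touching the torchvision mirrors):
# from tensorflow.keras.datasets import mnist
# (x_train, y_train), (x_test, y_test) = mnist.load_data()
# train_loader = arrays_to_loader(x_train, y_train, batch_size=64)
```

This sidesteps the torchvision download path entirely, at the cost of redoing the grayscale-to-3-channel repeat and normalization inside the helper if your model needs them.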