Error in looping through the data_path

I am trying to load the MNIST dataset. I know there are many different ways to do so, but I am running into an error when loading it via the following method:

# the image locations are already stored in the list path_digit
for i in range(10):
  print("check")
  stacked_training[i] = torch.stack([tensor(Image.open(o)) for o in (path_digit[i])]).float()/255
  print(stacked_training[i].shape, '\n')

Below is the error message:

check
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-10-bb2269508aed> in <module>()
----> 4   stacked_training[i] = torch.stack([tensor(Image.open(o)) for o in (path_digit[i])]).float()/255
      5   print(stacked_training[i].shape, '\n')

RuntimeError: expand(torch.FloatTensor{[5923, 28, 28]}, size=[28, 28]): the number of sizes provided (2) must be greater or equal to the number of dimensions in the tensor (3)

How did you define path_digit?
The torchvision.datasets.MNIST dataset uses the internal data attribute to load all images from the downloaded binary files, so I’m not sure how you extracted the paths.
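The RuntimeError itself suggests that stacked_training might have been pre-allocated as a fixed-shape tensor, in which case assigning a [5923, 28, 28] stack into the [28, 28] slot stacked_training[i] would fail exactly like this; storing each per-digit stack in a plain Python list avoids that. For comparison, here is a minimal sketch (just an illustration of the data attribute mentioned above, assuming torchvision is installed and ./data is a writable download location) that builds the same per-digit stacks without going through image paths at all:

# Minimal sketch: build one [N, 28, 28] float tensor per digit directly from
# torchvision's MNIST, using its internal data/targets tensors instead of file paths.
# root="./data" is just an assumed download location.
from torchvision import datasets

mnist = datasets.MNIST(root="./data", train=True, download=True)

stacked_training = []  # a list sidesteps the fixed-shape indexing problem above
for digit in range(10):
    imgs = mnist.data[mnist.targets == digit]   # uint8 tensor, shape [N, 28, 28]
    stacked_training.append(imgs.float() / 255)
    print(digit, stacked_training[digit].shape)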

EDIT: Seems to be solved in this topic.

Yes, this has been solved. Sorry for not editing this post sooner.