How to load a list of numpy arrays to pytorch dataset loader?

Hi, I am a newbie and I am trying to train on the MRI2D dataset. My dataset consists of 2D slices saved as a single matrix with the shape [500, 3, 64, 64]. When using the DataLoader, I got this error: `Expected 4-dimensional input for 4-dimensional weight 64 3 3 3, but got 5-dimensional input of size [4, 500, 3, 64, 64] instead`.
What should I do? 500 is the number of 2D MRI pictures, and 3 is the number of channels.
I found that the DataLoader reads the whole 4D dataset as one sample and then batches it, adding an extra dimension. That is why the input becomes 5-dimensional. How do I make the DataLoader draw batches from the 500 slices instead?


How would you like to process these MRI images?
If you want to process all 500 slices at once, you might want to use 3-dimensional layers, e.g. nn.Conv3d.
Alternatively, if you would like to use each slice as a sample, you might use the 500 slices directly as the batch dimension.
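The second option can be sketched with `TensorDataset`, which indexes along dim 0, so each of the 500 slices becomes one sample (the random array here just stands in for the preloaded data):

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

# Stand-in for the preloaded [500, 3, 64, 64] numpy array
data = np.random.randn(500, 3, 64, 64).astype(np.float32)

# TensorDataset indexes along dim 0, so each slice is one sample
dataset = TensorDataset(torch.from_numpy(data))
loader = DataLoader(dataset, batch_size=4, shuffle=True)

batch, = next(iter(loader))
print(batch.shape)  # torch.Size([4, 3, 64, 64])
```

This way the model sees 4-dimensional inputs of shape [batch_size, 3, 64, 64], which matches what `nn.Conv2d` expects.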

Hello, thank you for your answer.
My idea is: use the 500 slices saved by numpy as the dataset, and draw batches from them for training. What should I do? I found that in `def __getitem__(self, index):` the whole stack of 500 slices is returned directly instead of a single slice per index, so each batch gets an extra dimension: I expected [4, 3, 256, 256], but I get [4, 500, 3, 256, 256]. What should I do?

Are you preloading the complete dataset, and are the 500 slices the complete set, or do you have more files (each with 500 slices)?
If you are preloading the data, you could do it in the `__init__` method of your Dataset and return each slice as a single sample in `__getitem__`.
I'm not sure how ImageDataset1 works, as img_lrr and img_hrr seem to be global variables. What shapes do these arrays have?
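As a minimal sketch of that pattern (the names and shapes are assumptions based on the img_lrr/img_hrr arrays mentioned below; adapt them to your actual data):

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class SliceDataset(Dataset):
    """Preloads the full arrays once and yields one slice per index."""

    def __init__(self, img_lr, img_hr):
        # img_lr / img_hr are assumed [500, 3, H, W] numpy arrays
        self.img_lr = torch.from_numpy(img_lr).float()
        self.img_hr = torch.from_numpy(img_hr).float()

    def __len__(self):
        # Number of samples = number of slices along dim 0
        return self.img_lr.size(0)

    def __getitem__(self, index):
        # Return a single [3, H, W] slice, not the whole stack
        return self.img_lr[index], self.img_hr[index]

# Stand-ins for the preloaded low-res / high-res arrays
lr = np.random.randn(500, 3, 64, 64).astype(np.float32)
hr = np.random.randn(500, 3, 256, 256).astype(np.float32)

loader = DataLoader(SliceDataset(lr, hr), batch_size=4, shuffle=True)
x, y = next(iter(loader))
print(x.shape, y.shape)  # torch.Size([4, 3, 64, 64]) torch.Size([4, 3, 256, 256])
```

The key point is that `__getitem__` indexes into the preloaded tensors and returns one sample; the DataLoader then stacks `batch_size` of these into a 4-dimensional batch.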

PS: you can post code snippets by wrapping them in three backticks ``` :wink:

Yes, I have already preloaded the complete dataset.
And I have just finished preparing the dataset following your advice.
Thank you for your answers in this discussion; I learned a lot.