Creating the dataset kills the kernel of the Jupyter notebook

I am trying to read the images of CIFAR-10 (50000 training samples) from my local machine and build the dataset. The problem is that the Jupyter kernel is killed abruptly. I also tried running file_name.py from the Anaconda terminal, but I have the same issue. When I debugged the problem, the process was killed sometimes after loading 25000 images and sometimes after 45000. I should mention that I resize the images from 32×32 to 224×224 to fit the input of ResNet-18.

Do you think it is a resource problem? Also, I am not using the GPU while loading the images to create the dataset, because I will only use the GPU in the training step.

The kernel dies before it finishes this code (inside my Dataset class):

    # Map each class folder name to an integer label; torch.tensor cannot
    # build a long tensor from the folder-name strings directly.
    class_to_idx = {c: i for i, c in enumerate(sorted(os.listdir(self.dir_path)))}

    for sub_dir in os.listdir(self.dir_path):
        files_lst = os.listdir(os.path.join(self.dir_path, sub_dir))
        for name in files_lst:
            path = os.path.join(self.dir_path, sub_dir, name)
            image = Image.open(path)
            image = transform(image)  # resizes 32x32 to 224x224
            data_lst.append(image)
            target_lst.append(class_to_idx[sub_dir])

    self.dataset = torch.stack(data_lst, 0)
    self.labels = torch.tensor(target_lst, dtype=torch.long)

The specifications of my laptop are: 11th Gen Intel Core i9 (vPro), NVIDIA RTX A4000 Laptop GPU, 32 GB RAM.

Thanks for any help.

I think you are running out of host memory. Each resized image is a 3 × 224 × 224 float32 tensor, about 0.6 MB, so 50000 of them come to roughly 30 GB, which is right at your 32 GB of RAM (and the final torch.stack would need a second full copy on top of the list). Watch the memory usage via htop while the loop runs; a process terminated by the OS OOM killer produces no Python traceback, which would explain the kernel dying silently at a varying number of images.
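The usual fix is to not materialize all 50000 resized tensors up front: store only the file paths and labels, and open and transform each image on demand in __getitem__. Here is a minimal sketch of that idea, assuming the same folder-per-class layout as your loop (the class name LazyImageDataset and the .convert("RGB") call are my own additions):

    import os

    from PIL import Image
    from torch.utils.data import Dataset

    class LazyImageDataset(Dataset):
        # Stores only (path, label) pairs; images are decoded lazily.
        def __init__(self, dir_path, transform):
            self.transform = transform
            classes = sorted(os.listdir(dir_path))
            self.class_to_idx = {c: i for i, c in enumerate(classes)}
            self.samples = []
            for sub_dir in classes:
                sub_path = os.path.join(dir_path, sub_dir)
                for name in os.listdir(sub_path):
                    self.samples.append(
                        (os.path.join(sub_path, name), self.class_to_idx[sub_dir]))

        def __len__(self):
            return len(self.samples)

        def __getitem__(self, idx):
            path, label = self.samples[idx]
            image = Image.open(path).convert("RGB")  # decode one image at a time
            return self.transform(image), label

With a DataLoader around it, only a few batches of 3×224×224 tensors live in memory at any time instead of the whole dataset. Note that torchvision.datasets.ImageFolder implements essentially this pattern, so you could also use it directly with your transform.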