RuntimeError: DataLoader worker (pid 963720) is killed by signal: Segmentation fault

Hey guys! When using my custom dataset with a DataLoader, I get the following error:

RuntimeError: DataLoader worker (pid 963720) is killed by signal: Segmentation fault.

This has already been discussed a few times, and many people solved it by increasing their shared memory. I’m absolutely not familiar with shared memory, but if I run “sudo sysctl -a | grep shm” in a terminal, I get:
“kernel.shm_next_id = -1
kernel.shm_rmid_forced = 0
kernel.shmall = 18446744073692774399
kernel.shmmax = 18446744073692774399
kernel.shmmni = 4096
vm.hugetlb_shm_group = 0”
which I would say means I already have the maximum value for shared memory (I never changed it; is this the default on Linux?)…
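From what I’ve read, the sysctl values above govern SysV shared memory, while DataLoader workers pass tensors through the /dev/shm tmpfs, so its free space may be the limit that actually matters. A minimal stdlib sketch to check it, assuming a Linux box where /dev/shm exists:

```python
import os

# DataLoader workers exchange tensors via the /dev/shm tmpfs, so its
# size (not kernel.shmmax from sysctl) is often the relevant limit.
# Assumption: Linux; on other systems /dev/shm may not exist.
shm = "/dev/shm"
free_gib = None
if os.path.isdir(shm):
    st = os.statvfs(shm)
    # free blocks available to unprivileged users * block size, in GiB
    free_gib = st.f_bavail * st.f_frsize / 1024**3
    print(f"{shm} free: {free_gib:.1f} GiB")
```

`df -h /dev/shm` in a terminal reports the same thing.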

Generally, the shared-memory explanation would make sense in my case, since the error only occurs if I use a specific transform. The transform is called Vahadane normalization and uses a reference image to make all images share the same color distribution. The reference image is part of the normalizer object and could be too big for the shared memory. But how does that make sense if my shared memory is already that big? I’m confused; any help would be much appreciated <3
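In case it helps others reproduce my debugging so far: with num_workers=0 the DataLoader runs everything in the main process, which usually turns the opaque worker segfault into a normal Python traceback. A minimal sketch with a hypothetical stand-in dataset (the real one applies the Vahadane transform in __getitem__):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    """Hypothetical stand-in for my custom dataset (names are made up)."""
    def __len__(self):
        return 4

    def __getitem__(self, idx):
        # In the real dataset, the Vahadane normalization would run here.
        return torch.randn(3, 8, 8)

# num_workers=0 disables worker processes, so a crash inside the
# transform surfaces as a regular traceback instead of
# "DataLoader worker ... killed by signal: Segmentation fault".
loader = DataLoader(ToyDataset(), batch_size=2, num_workers=0)
batch = next(iter(loader))
print(batch.shape)
```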