DataLoader is taking too much time to run

I wrote this Dataset for image colorization, and when I run the code below, even a single iteration takes a very long time. Can someone please take a look and tell me why it is so slow?

import os
import torch
from torch.utils.data import Dataset, DataLoader
from skimage import io
from skimage.color import rgb2gray, rgb2lab

cuda = torch.device('cuda')

class DATALODER(Dataset):
	def __init__(self, root_dir, transform=None):
		self.root_dir = root_dir
		self.transform = transform

	def __len__(self):
		return 100

	def __getitem__(self, idx):
		img_name = os.path.join(self.root_dir, str(idx) + ".jpg")
		image = io.imread(img_name)
		gray_img = rgb2gray(image)
		lab_img = rgb2lab(image)[:,:,1:]
		sample = (gray_img, lab_img)
		if self.transform:
			sample = self.transform(sample)

		return sample


class ToTensor(object):
	def __call__(self, sample):
		gray_img, lab_img = sample
		tensor_gray = torch.tensor(gray_img, device=cuda).float()
		tensor_label = torch.tensor(lab_img.transpose((2,0,1)), device=cuda).float()

		tensor_label = tensor_label.view(1,2, tensor_gray.size()[0], tensor_gray.size()[1])
		tensor_gray = tensor_gray.view(1, 1, tensor_gray.size()[0], tensor_gray.size()[1])

		return (tensor_gray,tensor_label)

root_dir = "./images/Train/"



my_dataset = DATALODER(root_dir, ToTensor())

train_data = DataLoader(my_dataset, batch_size=4, num_workers=2)

for i, temp in enumerate(train_data):
	print(i)
	if i==1:
		break

Could you move the device=cuda part out of the transform and into the for loop? Creating CUDA tensors inside the Dataset means every sample is copied to the GPU one by one, inside the DataLoader's worker processes, which is slow and can be problematic with multiprocessing.
Usually you load your data on the CPU while the GPU is busy training your model.
Once a training iteration is done, your data batches are already waiting in the queue to be transferred onto the GPU.
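Here is a rough sketch of what I mean, based on your ToTensor: keep the tensors on the CPU in the transform (also dropping the extra batch dimension, since the DataLoader adds that for you when it collates samples), and move each batch to the GPU inside the loop. The shapes are just illustrative.

```python
import torch

class ToTensor(object):
	"""Build CPU tensors in the workers; transfer to the GPU in the training loop."""
	def __call__(self, sample):
		gray_img, lab_img = sample
		# No device=cuda here: the workers hand back CPU tensors.
		tensor_gray = torch.as_tensor(gray_img).float().unsqueeze(0)  # (1, H, W)
		# (H, W, 2) -> (2, H, W); copy() makes the transposed array contiguous
		tensor_label = torch.as_tensor(lab_img.transpose(2, 0, 1).copy()).float()
		# No .view(1, ...): the DataLoader stacks samples into (B, 1, H, W) itself.
		return tensor_gray, tensor_label

# In the training loop (sketch):
# device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
# for i, (gray, label) in enumerate(train_data):
#     gray, label = gray.to(device), label.to(device)
#     ...
```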

I'm running into a similar issue. Can you tell me how to solve it? I'm using batch_size=36 and num_workers=8.
When I iterate over the DataLoader, it prints 8 batches quickly, then takes a long time before printing the next 8.

It looks like the workers are not done loading the next batches in time.
Are you using an HDD or an SSD in your system?
Also, are you just iterating over the DataLoader, or do you also have a workload (training) inside the loop?
Usually you can hide some of the data loading time behind the training procedure.
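One way to check whether loading or compute is the bottleneck is to time them separately inside the loop. This is a minimal sketch using a dummy in-memory dataset (so it runs anywhere); in your case you would use your own Dataset, and the "compute" section would be the forward/backward pass.

```python
import time
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in dataset, just to illustrate the timing pattern.
dataset = TensorDataset(torch.randn(64, 1, 32, 32), torch.randn(64, 2, 32, 32))
loader = DataLoader(dataset, batch_size=8, num_workers=0,
                    pin_memory=torch.cuda.is_available())

load_time, compute_time = 0.0, 0.0
end = time.perf_counter()
for gray, label in loader:
	load_time += time.perf_counter() - end  # time spent waiting for the batch
	start = time.perf_counter()
	# ... forward/backward pass would go here; simulated with a cheap op ...
	_ = gray.mean() + label.mean()
	compute_time += time.perf_counter() - start
	end = time.perf_counter()

print(f"waiting for data: {load_time:.3f}s, compute: {compute_time:.3f}s")
```

If the "waiting for data" share dominates, the workers can't keep up (slow disk, heavy per-sample preprocessing such as rgb2lab, or too few workers); if compute dominates, the loading time is already being masked.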