How to write a simple dataloader

I want to write a simple dataloader. Can I use something like the following?

for epoch in range(MaxEpoch):
    for i in range(num_samples // batch_size):
        x_input = X[i*batch_size:(i+1)*batch_size, :, :, :]
        y_input = Y[i*batch_size:(i+1)*batch_size, :, :, :]
        y_pred = model(x_input)

Here, X has shape 16540 * 3 * 500 * 500 (there are 16540 images in total, each of size 3 * 500 * 500) and Y has shape 16540 * 1 * 17 * 100. The number of samples is 16540 and the batch size is 64, so the number of full batches is 258.
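As a quick sanity check on that batch count (just a sketch of the arithmetic, not part of the training code): integer division leaves a remainder, so the loop above never touches the last few samples.

num_samples = 16540
batch_size = 64
print(num_samples // batch_size)  # 258 full batches
print(num_samples % batch_size)   # 28 leftover samples the manual loop skips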

Sure, there’s no reason something like this can’t work. I always prefer to use PyTorch’s dataset/dataloading classes, though.

You can use torch.utils.data.TensorDataset to wrap your X and Y, and then wrap a DataLoader around that dataset.

import torch

# Each index returns the pair (X[i], Y[i])
dataset = torch.utils.data.TensorDataset(X, Y)
# Add shuffle=True if you want the batches reshuffled every epoch
dataloader = torch.utils.data.DataLoader(dataset, batch_size=batch_size, num_workers=4)

for x_input, y_input in dataloader:
    y_pred = model(x_input)
    

Going through the DataLoader class will also give you more flexibility down the road (shuffling, multi-worker loading, swapping in a custom dataset, etc.).
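For example, if the images eventually don’t fit in memory, you can swap in a custom Dataset without touching the training loop. A minimal sketch (MyDataset is a hypothetical name, not something from your code):

import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    """Hypothetical dataset that builds each sample on demand."""

    def __init__(self, X, Y):
        self.X = X  # could instead hold file paths and load images lazily in __getitem__
        self.Y = Y

    def __len__(self):
        return self.X.shape[0]

    def __getitem__(self, idx):
        # Per-sample transforms / augmentation would go here
        return self.X[idx], self.Y[idx]

dataloader = DataLoader(MyDataset(X, Y), batch_size=64, shuffle=True, num_workers=4)

The training loop stays exactly the same as above; only the dataset construction changes.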