DataLoaders with variable batch sizes and accessing specific features in a DataLoader

We are training a physics-informed deep learning network. We have a time-series dataset that we want to feed to the network in batches, and it consists of five excel sheets of different lengths. We want to use four of the sheets for training and the fifth as the test dataset. We split the data this way because the experiments were performed sequentially, so we cannot feed them to the network in random order. We need help with these two questions:
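For context, this is roughly how we load the experiments. Whether the five experiments live in one workbook or in five separate files does not matter here; the sketch below assumes a single workbook named experiments.xlsx purely for illustration.

import pandas as pd

# Hypothetical workbook holding the five experiments as separate sheets.
# sheet_name=None returns a dict mapping sheet name -> DataFrame.
sheets = pd.read_excel("experiments.xlsx", sheet_name=None)
names = list(sheets.keys())

train_dfs = [sheets[n] for n in names[:4]]  # first four experiments for training
test_df = sheets[names[4]]                  # fifth experiment held out as the test set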
1. How can we make DataLoaders with variable batch sizes?
2. How can we access specific columns and features in a DataLoader?

We want to do batch training, so we need to feed our data to the network in batches, but we also need to extract specific columns from the DataLoader because we are training a physics-informed neural network.
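To make the question concrete, something like the sketch below is what we have in mind. ExperimentDataset, the dummy data, and the column indices are only placeholders, not code we already have working.

import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class ExperimentDataset(Dataset):
    # One experiment (one excel sheet) as an ordered sequence of time steps.
    def __init__(self, data):
        # data: array of shape (num_time_steps, num_features)
        self.data = torch.as_tensor(data, dtype=torch.float32)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        row = self.data[idx]
        # Return named columns so the training loop can pull out t, u, and x
        # from every batch (the default collate_fn batches dicts key by key).
        return {"t": row[0], "u": row[1:-1], "x": row[3]}

# Dummy experiment with 100 time steps and 5 features, just for illustration.
experiment = np.random.rand(100, 5)

# shuffle=False keeps the sequential order of the experiment; the batch_size
# could be chosen differently for each experiment's loader, or a batch_sampler
# could be passed instead to get batches of varying size within one loader.
loader = DataLoader(ExperimentDataset(experiment), batch_size=32, shuffle=False)

for batch in loader:
    t, u, x = batch["t"], batch["u"], batch["x"]
    # ... feed t, u, x into the physics-informed loss here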

At the moment we simply slice the columns into plain tensors, outside of any DataLoader:

import torch

t_train = torch.tensor(train[:, :, 0])     # t: time column (index 0)
u_train = torch.tensor(train[:, :, 1:-1])  # u: feature columns 1 through second-to-last
x_train = torch.tensor(train[:, :, 3])     # x: spatial coordinate (column 3)
t_test = torch.tensor(test[:, :, 0])       # same slicing for the test experiment
u_test = torch.tensor(test[:, :, 1:-1])
x_test = torch.tensor(test[:, :, 3])
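And this is roughly how we would like to batch those tensors afterwards, again only a sketch reusing the slices above, with an arbitrary batch size:

from torch.utils.data import TensorDataset, DataLoader

# Keep t, u, x aligned in one dataset so every batch yields all three together.
train_ds = TensorDataset(t_train, u_train, x_train)
train_loader = DataLoader(train_ds, batch_size=16, shuffle=False)

for t_batch, u_batch, x_batch in train_loader:
    pass  # compute the physics-informed loss from t_batch, u_batch, x_batch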

I appreciate any help you can provide. Thank you.