How to train with batches?

I am experimenting with training word embeddings, and here is where I am. My input is an nxn matrix representing n words, each of length n. Through my model, I get an nx2 output, where 2 is the dimension of each word embedding. Training runs perfectly fine with NLL loss on a single sample. Now, if I want to train on, say, 100 such nxn inputs, how do I implement that in PyTorch? I've read about custom Datasets and DataLoaders, but my doubt is about the batches. Do I have to change the shape of my tensors to include a batch size across my whole implementation? Or do I just give a 100xnxn tensor to the dataset? What I want is for my model to get an nxn input from the DataLoader. How is that possible?
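For what it's worth, here is a minimal sketch of the Dataset/DataLoader pattern in question (the class name, n=5, and the random data are just placeholders, not the poster's actual setup). Each `__getitem__` returns a single nxn sample, and the DataLoader stacks samples into a batch_size x n x n tensor for you:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class WordMatrixDataset(Dataset):
    # Hypothetical dataset wrapping a 100 x n x n input tensor
    # and a matching target tensor.
    def __init__(self, inputs, targets):
        self.inputs = inputs      # shape: (100, n, n)
        self.targets = targets    # shape: (100, n)

    def __len__(self):
        return self.inputs.shape[0]

    def __getitem__(self, idx):
        # Return ONE n x n sample; the DataLoader's default collate
        # stacks these into a (batch_size, n, n) batch.
        return self.inputs[idx], self.targets[idx]

n = 5
dataset = WordMatrixDataset(
    torch.randn(100, n, n),
    torch.zeros(100, n, dtype=torch.long),
)
loader = DataLoader(dataset, batch_size=10, shuffle=True)

for x, y in loader:
    print(x.shape)  # torch.Size([10, 5, 5])
    break
```

So you keep writing the dataset in terms of single nxn samples; the batch dimension is added by the DataLoader, not by you.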

I’m unsure whether the n dimension already refers to the batch dimension or whether it is a separate dimension (e.g. a temporal dimension).
Since PyTorch models expect inputs that already contain a batch dimension, I would guess the first dimension is treated as the batch dim.
Depending on the layers used, you could create a DataLoader, specify a batch_size, pass the data with the additional batch dimension to the model, and check whether any shape mismatch errors are raised.
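As a quick illustration of that check (a toy layer, not the poster's model): most built-in layers such as nn.Linear only operate on the last dimension, so any leading dimensions are transparently treated as batch dims and no code change is needed:

```python
import torch
import torch.nn as nn

# Toy check: nn.Linear maps the last dim (n) to emb_dim,
# so leading dims pass through as batch dims.
n, emb_dim = 5, 2
layer = nn.Linear(n, emb_dim)

single = torch.randn(n, n)        # one n x n sample
batched = torch.randn(100, n, n)  # 100 samples with an extra batch dim

print(layer(single).shape)   # torch.Size([5, 2])
print(layer(batched).shape)  # torch.Size([100, 5, 2])
```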

Thanks for your response. n is a separate dimension.
I have created a custom model and am not using any of PyTorch's predefined models. I haven't accounted for a batch size in my custom model, so I'm guessing I would have to add it there, right?
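One common way to make a hand-written forward pass batch-aware (a sketch under the assumption that it is built from matrix multiplies; `W` here is a hypothetical weight, not the poster's) is to replace `torch.mm`, which only accepts 2-D tensors, with `torch.matmul`, which broadcasts over a leading batch dimension:

```python
import torch

n, emb_dim = 5, 2
W = torch.randn(n, emb_dim, requires_grad=True)  # hypothetical weight

def forward_single(x):
    # torch.mm is strictly 2-D: (n, n) @ (n, emb_dim) -> (n, emb_dim)
    return torch.mm(x, W)

def forward_batched(x):
    # torch.matmul broadcasts: works for (n, n) and (batch, n, n)
    return torch.matmul(x, W)

x = torch.randn(100, n, n)
print(forward_batched(x).shape)  # torch.Size([100, 5, 2])
```

With that change, the same forward function handles both a single nxn sample and a DataLoader batch.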