Turning off batch_size of DataLoader

This is counter-intuitive, but there is a reason. I know that batched, parallel processing is what makes DataLoaders great.

I designed a general-purpose (I thought) method that accepts an autoencoder and a DataLoader and trains the model.

Now I’m working on a new model that exclusively uses LSTMCells, which apparently don’t accept batches. So instead of rewriting my train_encoder(model, DataLoader) method with a huge if-else block, is there a simple way to turn off batching completely?
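For context, a hypothetical version of such a method might look like this (the loop structure, names, and loss choice are illustrative, not from the original post):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def train_encoder(model, loader, epochs=1):
    """Generic training loop: feeds each batch through the autoencoder
    and minimizes reconstruction error."""
    optimizer = torch.optim.Adam(model.parameters())
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for (x,) in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), x)
            loss.backward()
            optimizer.step()

# Works for any model whose forward() accepts the batches the loader yields.
model = nn.Sequential(nn.Linear(8, 3), nn.Linear(3, 8))
loader = DataLoader(TensorDataset(torch.randn(16, 8)), batch_size=4)
train_encoder(model, loader)
```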

I’m not sure I understand the use case completely, but you could set batch_size=1 to get single samples in each batch (and remove the batch dimension if needed).
Would this work for you? If not, could you explain what “removing batching” would mean?
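A minimal sketch of that suggestion, assuming the model expects unbatched tensors (the dataset and shapes here are just for illustration):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 4 samples with 8 features each.
dataset = TensorDataset(torch.randn(4, 8))

# batch_size=1 yields one sample per iteration, still with a batch dim of 1.
loader = DataLoader(dataset, batch_size=1)

for (x,) in loader:
    sample = x.squeeze(0)  # drop the leading batch dimension if the model needs it
    print(x.shape, sample.shape)  # torch.Size([1, 8]) torch.Size([8])
```

Note that recent PyTorch versions also accept batch_size=None, which disables automatic batching entirely so the loader yields individual samples with no batch dimension at all.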

Hey @ptrblck, I just used your suggestion of setting batch_size=1. Thank you!