Hi @Arta_A,
You can’t directly create a batch of samples with different sizes because the DataLoader tries to collate them with torch.stack,
which requires that all tensors in the batch be the same size.
The workaround is to provide a custom collate function (through the DataLoader's collate_fn
argument) that builds the batch as, e.g., a Python list (or any structure you like) rather than a stacked tensor.
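A minimal sketch of that idea (the dataset here is hypothetical, just to produce tensors of varying lengths):

```python
import torch
from torch.utils.data import DataLoader, Dataset

# Hypothetical dataset whose samples are 1-D tensors of different lengths
class VariableSizeDataset(Dataset):
    def __init__(self, lengths):
        self.lengths = lengths

    def __len__(self):
        return len(self.lengths)

    def __getitem__(self, idx):
        return torch.randn(self.lengths[idx])

def list_collate(batch):
    # Return the samples as a plain Python list instead of calling torch.stack
    return batch

loader = DataLoader(
    VariableSizeDataset([3, 5, 7, 2]),
    batch_size=2,
    collate_fn=list_collate,
)

for batch in loader:
    # Each batch is a list of tensors, each with its own size
    print([tuple(t.shape) for t in batch])
```

Inside your training loop you would then iterate over the list and process each sample individually (or pad/pack them yourself if your model needs a tensor).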
A detailed example is given in this thread: How to create a dataloader with variable-size input - #3 by jdhao.