Tensors of variable size

Assume I have a set of objects of variable size, for example cells of different sizes in an image, and that I don't want to resize or pad them.
One needs to define a custom collate_fn to be able to create a dataset/dataloader that returns a list of tensors of variable size. Now, here is my question:
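For context, a minimal sketch of such a custom collate_fn (the dataset, sample shapes, and the name `list_collate` are made up for illustration):

```python
import torch
from torch.utils.data import DataLoader, Dataset

class VarSizeDataset(Dataset):
    """Toy dataset whose samples have different spatial sizes,
    e.g. crops of cells from an image (hypothetical example)."""
    def __init__(self):
        self.samples = [
            torch.randn(3, 10, 12),
            torch.randn(3, 7, 9),
            torch.randn(3, 15, 5),
        ]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]

def list_collate(batch):
    # Return the samples as-is in a list instead of stacking them,
    # since torch.stack would fail on mismatched shapes.
    return list(batch)

loader = DataLoader(VarSizeDataset(), batch_size=3, collate_fn=list_collate)
batch = next(iter(loader))
print([tuple(t.shape) for t in batch])  # → [(3, 10, 12), (3, 7, 9), (3, 15, 5)]
```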
Does torch.compile now allow users to pass a list of tensors of variable size to a PyTorch model, instead of the old solution:
in the transform step, pad each tensor to the same size, then stack them along a new first dimension in the collate function, and feed the resulting tensor to the model, where the first dimension is the batch size?

You are probably interested in torch.nested — PyTorch 2.0 documentation
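A short sketch of what torch.nested offers for this use case (the sample shapes are assumptions; note the API was still a prototype feature as of PyTorch 2.0):

```python
import torch

# A nested tensor holds constituents of different sizes without padding.
a = torch.randn(3, 10, 12)
b = torch.randn(3, 7, 9)
nt = torch.nested.nested_tensor([a, b])
print(nt.is_nested)  # → True

# You can still materialize a padded dense tensor on demand:
padded = torch.nested.to_padded_tensor(nt, 0.0)
print(padded.shape)  # → torch.Size([2, 3, 10, 12])
```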