What's a GPU-friendly way to keep tensors of different sizes together?

I want to create a list that contains tensors of different sizes. For example:
A list with two tensors

Is there any other way of keeping these tensors together apart from a Python list? Something more native to PyTorch…

No, a Python list is the way to go. It's not inefficient.
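A minimal sketch of that approach: the tensors stay in a plain Python list, and each one is moved to the GPU individually (the shapes below are just placeholders for illustration).

```python
import torch

# Tensors of different sizes cannot be stacked into a single tensor,
# but a plain Python list holds them together just fine.
tensors = [torch.randn(3, 4), torch.randn(5, 2)]

# Move each tensor to the GPU if one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
tensors = [t.to(device) for t in tensors]

print([tuple(t.shape) for t in tensors])  # → [(3, 4), (5, 2)]
```

Each element is an ordinary `torch.Tensor`, so autograd, `.to(device)`, and all the usual operations work per element; only operations that require a single uniform shape (like `torch.stack`) are off the table.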
