What's a GPU-friendly way to keep tensors of different sizes together?

Hi,
I want to keep tensors of different sizes together in one container. For example, a list with two tensors:
[[1, 2, 3],
 [4, 5, 6, 7]]

Is there any other way of keeping these tensors together, apart from a Python list? Something more native to PyTorch…

No, a Python list is the way to go, and it's not inefficient: the list only holds references, while each tensor's storage lives wherever you put it (CPU or GPU).
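A minimal sketch of what that looks like (the device check is just illustrative, and the values are taken from the example above):

```python
import torch

# Two tensors of different lengths, kept together in a plain Python list
tensors = [torch.tensor([1, 2, 3]), torch.tensor([4, 5, 6, 7])]

# The list itself stays on the host; each tensor is moved to the GPU
# individually, and the list then just holds references to device tensors.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
tensors = [t.to(device) for t in tensors]

# Operations are applied per tensor, e.g. summing each one
sums = [t.sum() for t in tensors]
```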
