How to reshape a list?


How can I reshape a list from
[64, 256, 16, 16] (list length: 1) → [256, 16, 16] * 64 (list length: 64)?

Could you explain your use case a bit more, please?
Is the first list a shape of a tensor or a Python list containing 4 values (which would then have a length of 4)?


I have features with shape [64, 256, 16, 16] (batch_size, channels, width, height),
and I append these features to an empty list.
The list then has length 1 (the whole feature tensor sits in list_a[0]).

But I want the list to have length 64, with every element holding [256, 16, 16] features.

In that case, you could split the tensor:

import torch

x = torch.randn([64, 256, 16, 16])
x_list = x.split(1, 0)
print(len(x_list))      # > 64
print(x_list[0].shape)  # > torch.Size([1, 256, 16, 16])
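If each element should be exactly [256, 16, 16] (without the leading singleton dimension that split(1, 0) keeps), torch.unbind removes that dimension in one call; a small sketch:

import torch

x = torch.randn(64, 256, 16, 16)
# unbind along dim 0 returns a tuple of 64 tensors, each with dim 0 removed
x_list = torch.unbind(x, 0)
print(len(x_list))      # 64
print(x_list[0].shape)  # torch.Size([256, 16, 16])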

Thanks a lot ! It works well.
I have one more question.
I want to use a DataLoader with this list, so where should I append the labels (shape: [64]) for it?
x_data_loader = torchdata.DataLoader(x_list, batch_size=...)

You could pass the input tensor together with the target tensor to a TensorDataset and pass that to a DataLoader, without splitting:

import torch
from torch.utils.data import TensorDataset, DataLoader

x = torch.randn(64, 256, 16, 16)
y = torch.randint(0, 10, (64,))
dataset = TensorDataset(x, y)
loader = DataLoader(dataset, batch_size=...)
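Once the TensorDataset is wrapped in a DataLoader, iterating it yields (features, labels) mini-batches directly; a small sketch with assumed shapes and an assumed batch size of 16:

import torch
from torch.utils.data import TensorDataset, DataLoader

# toy data matching the shapes in this thread
x = torch.randn(64, 256, 16, 16)
y = torch.randint(0, 10, (64,))
loader = DataLoader(TensorDataset(x, y), batch_size=16)

# each iteration yields a (features, labels) pair
for xb, yb in loader:
    print(xb.shape, yb.shape)  # torch.Size([16, 256, 16, 16]) torch.Size([16])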

Actually, I have many [64, 256, 16, 16] feature tensors, so I want to append all of them to one list and then use it with a DataLoader. (Maybe this approach is inefficient.)

whole_list = []
for x, y in data_loader:
    # x shape: [64, 256, 16, 16]
    # y shape: [64]
    whole_list.append(x, y)  # It failed. How can I append the labels as well?

new_data_loader = torchdata.DataLoader(whole_list, batch_size=...)

Could you explain your data shape a bit more?
If you are concatenating these [64, 256, 16, 16] features, you would end up with a tensor of [N, 64, 256, 16, 16].
Would dim0 in this case refer to the batch size?
If so, you could call whole_list = torch.stack(whole_list), and pass it to a TensorDataset as shown in my example.
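A minimal sketch of that suggestion, with hypothetical stand-ins for the collected feature and label batches (here N = 3 collected batches is an assumption for illustration):

import torch
from torch.utils.data import TensorDataset, DataLoader

# pretend we collected N = 3 feature/label batches from the first loader
whole_list = [torch.randn(64, 256, 16, 16) for _ in range(3)]
whole_labels = [torch.randint(0, 10, (64,)) for _ in range(3)]

features = torch.stack(whole_list)   # [3, 64, 256, 16, 16]
labels = torch.stack(whole_labels)   # [3, 64]
dataset = TensorDataset(features, labels)
loader = DataLoader(dataset, batch_size=1)

If each of the 64 entries should instead be treated as an individual sample, torch.cat along dim 0 would merge the batches into [192, 256, 16, 16] features and [192] labels.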