So, the way you currently have it written, `target['a']` is a list, not a Tensor, as the error message indicates. Because it's a list, you can't use any PyTorch methods on it.

Normally, if `target['a']` were a list of lists, you could just call `torch.tensor(target['a'])` to convert it, and from there use PyTorch methods like `permute()`. For example:
```python
import torch

a = [1, 2, 3]
b = []
for i in range(3):
    b.append(a)          # b is a list of lists
b = torch.tensor(b)      # now a Tensor of shape [3, 3]
b = b.permute(1, 0)      # PyTorch methods work from here
```
However, your `target['a']` is a list of tensors, and it seems you can't simply cast that to a tensor. That's why I suggested either preallocating a large tensor and then saving to each row, or concatenating the tensors together. According to this post, you can pass a list of tensors to `torch.cat()` to concatenate them, meaning you can do `torch.cat(target['a'])`. However, this will give you a tensor of shape `[800, 64, 32, 32]`.
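To illustrate, here is a minimal sketch using a small stand-in list of tensors (the shapes `[1, 2, 3]` are made up for brevity, in place of your `[1, 64, 32, 32]`):

```python
import torch

# Stand-in for target['a']: a list of tensors, each of shape [1, 2, 3]
tensors = [torch.randn(1, 2, 3) for _ in range(4)]

# torch.cat joins along an existing dimension (dim=0 by default),
# so the leading size-1 dimensions merge into one axis of size 4
out = torch.cat(tensors)
print(out.shape)  # torch.Size([4, 2, 3])
```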
Also mentioned in the post linked above: `torch.cat()` concatenates along an existing dimension, while `torch.stack()` stacks along a new dimension. Therefore, if you want the shape after combining to be `[800, 1, 64, 32, 32]`, you should use `torch.stack()`.
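You can see the difference side by side (again with small stand-in shapes, not your actual data):

```python
import torch

tensors = [torch.randn(1, 2, 3) for _ in range(4)]

# cat: an existing dimension grows; stack: a new dimension is inserted
print(torch.cat(tensors).shape)    # torch.Size([4, 2, 3])
print(torch.stack(tensors).shape)  # torch.Size([4, 1, 2, 3])
```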
Note, both `torch.cat()` and `torch.stack()` let you specify which dimension to concatenate / stack along, so you may be able to avoid needing to permute afterward altogether. I.e.:

```python
torch.stack(target['a'], dim=2)
```
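As a quick sketch of what the `dim` argument does (small stand-in shapes again; the right `dim` for your case depends on the layout you actually want):

```python
import torch

tensors = [torch.randn(1, 2, 3) for _ in range(4)]

# stack inserts the new dimension at the given position
print(torch.stack(tensors, dim=0).shape)  # torch.Size([4, 1, 2, 3])
print(torch.stack(tensors, dim=2).shape)  # torch.Size([1, 2, 4, 3])
```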