# Nested lists of Tensors to Tensor

Hello. I have a deeply nested list of tensors. It is returned by an external library that was previously NumPy-based, which I modified to return torch tensors instead of NumPy arrays:

```python
a = list(self.nsgt.forward((x,)))

print(type(a))
print(len(a))
print(type(a[0]))
print(len(a[0]))
print(type(a[0][0]))
print(len(a[0][0]))
print(type(a[0][0][0]))
print(len(a[0][0][0]))
print(a[0][0][0].device)
```

Results in:

```
<class 'list'>
59
<class 'list'>
2
<class 'list'>
126
<class 'torch.Tensor'>
304
cuda:0
```

I would like to convert this into a tensor with the following shape:

```
(59, 2, 126, 304)
```

Previously, when this was a NumPy ndarray, achieving what I needed was trivial:

```python
A = np.asarray(a)
print(A.shape)
print(A.dtype)
```

This would result in the correct ndarray:

```
(59, 2, 126, 304)
float32
```

In torch, I’m having trouble achieving the same with `torch.tensor` or `torch.stack`.

`torch.tensor` issue:

```
A = torch.tensor(a)
ValueError: only one element tensors can be converted to Python scalars
```

`torch.stack` issue:

```
A = torch.stack((a))
TypeError: expected Tensor as element 0 in argument 0, but got list
```

For PyTorch, you would need to use nested stacks or a for loop:

```python
t = torch.stack([torch.stack([torch.stack(l2, dim=0) for l2 in l1], dim=0)
                 for l1 in a], dim=0)
```
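As a self-contained check (using dummy CPU tensors mimicking the shapes from the question), the nested stack does produce the desired shape:

```python
import torch

# Dummy data: 59 lists of 2 lists of 126 tensors, each of length 304.
a = [[[torch.zeros(304) for _ in range(126)] for _ in range(2)]
     for _ in range(59)]

A = torch.stack([torch.stack([torch.stack(l2, dim=0) for l2 in l1], dim=0)
                 for l1 in a], dim=0)
print(tuple(A.shape))  # (59, 2, 126, 304)
```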

or somesuch (I might have gotten the brackets wrong). But this will copy the tensors several times, so an alternative is to just allocate the output once and copy in a loop:

```python
# Allocate the output once, matching the dtype and device of the data,
# then copy each innermost tensor into place.
A = torch.empty(len(a), len(a[0]), len(a[0][0]), a[0][0][0].size(0),
                dtype=a[0][0][0].dtype, device=a[0][0][0].device)
for i, aa in enumerate(a):
    assert len(aa) == A.size(1)
    for j, aaa in enumerate(aa):
        assert len(aaa) == A.size(2)
        for k, aaaa in enumerate(aaa):
            assert len(aaaa) == A.size(3)
            A[i, j, k] = aaaa
```

This does a lot of Python looping but copies the data only once. Once you do something interesting with the data afterwards, you may not care much about the Python overhead, even if you have heard that Python loops are slow. Best regards

Tomas


Perfect. I ended up using the for-loop solution.

I had thought of doing something similar, but I was under the impression that tensors were immutable.
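For the record, PyTorch tensors are mutable; autograd restricts in-place operations on leaf tensors that require gradients, but an ordinary tensor can be written to freely. A quick sketch:

```python
import torch

A = torch.zeros(2, 3)
A[0, 1] = 5.0           # element assignment modifies A in place
A[1] = torch.ones(3)    # a whole row can be copied from another tensor
print(A)
# tensor([[0., 5., 0.],
#         [1., 1., 1.]])
```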