Converting a list to a tensor

There is a variable ‘tmp’ with 3 dimensions:

type(tmp) -> <class 'list'>
type(tmp[0]) -> <class 'torch.Tensor'>
type(tmp[0][0]) -> <class 'torch.Tensor'>

I want to convert ‘tmp’ into torch.Tensor type.
But when I run the code below, an error occurs.

torch.Tensor(tmp)
>> ValueError: only one element tensors can be converted to Python scalars

How can I fix this?

You could use torch.cat or torch.stack to create a tensor from the list.
Let me know if that works for you or if you expect a specific shape.
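
For example, a quick sketch of the difference between the two (the shapes here are made up for illustration): torch.stack adds a new leading dimension, while torch.cat joins along an existing one.

import torch

tmp = [torch.rand(2, 4), torch.rand(2, 4)]

stacked = torch.stack(tmp)             # new leading dimension -> shape [2, 2, 4]
concatenated = torch.cat(tmp, dim=0)   # joined along dim 0 -> shape [4, 4]

print(stacked.shape)       # torch.Size([2, 2, 4])
print(concatenated.shape)  # torch.Size([4, 4])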

import torch

tmp = [torch.rand(2, 4), torch.rand(2, 4)]
print(type(tmp))
print(type(tmp[0]))
print(type(tmp[0][0]))

# The full tmp
print(tmp)

# torch.stack joins the list of tensors along a new leading dimension
print(torch.stack(tmp))

Output:

<class 'list'>
<class 'torch.Tensor'>
<class 'torch.Tensor'>
[tensor([[0.3083, 0.6300, 0.0910, 0.1102],
        [0.4926, 0.2878, 0.5105, 0.2535]]), tensor([[0.2245, 0.6037, 0.6991, 0.1598],
        [0.8157, 0.5108, 0.3417, 0.6245]])]
tensor([[[0.3083, 0.6300, 0.0910, 0.1102],
         [0.4926, 0.2878, 0.5105, 0.2535]],

        [[0.2245, 0.6037, 0.6991, 0.1598],
         [0.8157, 0.5108, 0.3417, 0.6245]]])

Hope this helps

Thanks for the kind answer. But torch.stack doesn’t work in this case because the tensors do not all have the same size. Any solution?

Nested tensors (WIP) might be usable.
Since this feature is not implemented yet, you might need to keep the list.

Depending on your use case, you might be able to create tensors using padding or slicing.
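
For example, a minimal padding sketch (the tensor shapes and the zero-padding to the largest height/width are assumptions for illustration, not from your post):

import torch
import torch.nn.functional as F

tmp = [torch.rand(2, 4), torch.rand(2, 3), torch.rand(4, 2)]

max_h = max(t.size(0) for t in tmp)
max_w = max(t.size(1) for t in tmp)

# F.pad takes (left, right, top, bottom) for a 2D tensor
padded = [F.pad(t, (0, max_w - t.size(1), 0, max_h - t.size(0))) for t in tmp]
result = torch.stack(padded)  # zero-padded tensors now share the same shape

print(result.shape)  # torch.Size([3, 4, 4])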


@ptrblck I was trying something with subclasses. Can you check whether it is going in the right direction?

import torch


class nestedTensor(torch.Tensor):
    def __new__(cls, x, *args, **kwargs):
        print(super(nestedTensor, cls))
        # Build one tensor per element and return them as a Python list
        return [super(torch.Tensor, cls).__new__(cls, y, *args, **kwargs) for y in x]

    def __init__(self, x):
        print("Created nestedTensor")


tmp = nestedTensor([torch.rand(2, 4), torch.rand(2, 3), torch.rand(4, 2)])
print(tmp)

What would be the difference between this class and a plain Python list?

Is the nested tensors library/feature meant for things like what I describe here: Best way to convert a list to a tensor? - #6 by Brando_Miranda? I am also interested in deeper nesting where the final element is a tensor…


I guess this works?

# %%

import torch

# trying to convert a list of tensors to a torch.tensor

x = torch.randn(3)
xs = [x.numpy(), x.numpy()]
# xs = torch.tensor(xs)
xs = torch.as_tensor(xs)

print(xs)
print(xs.size())

# %%

import torch

# trying to convert a list of tensors to a torch.tensor

x = torch.randn(3)
xs = [x.numpy(), x.numpy(), x.numpy()]
xs = [xs, xs]
# xs = torch.tensor(xs)
xs = torch.as_tensor(xs)

print(xs)
print(xs.size())

What’s wrong with this solution…?

Output:

tensor([[[0.3423, 1.6793, 0.0863],
         [0.3423, 1.6793, 0.0863],
         [0.3423, 1.6793, 0.0863]],
        [[0.3423, 1.6793, 0.0863],
         [0.3423, 1.6793, 0.0863],
         [0.3423, 1.6793, 0.0863]]])
torch.Size([2, 3, 3])

I don’t see anything wrong with your approach, but as described in the other topic, you could use torch.stack instead of transforming the tensors to numpy arrays and calling torch.as_tensor.
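
For example, a rough sketch of the torch.stack version of your snippet (same data, no numpy round-trip):

import torch

x = torch.randn(3)
xs = torch.stack([x, x, x])   # shape [3, 3]
xs = torch.stack([xs, xs])    # shape [2, 3, 3]

print(xs)
print(xs.size())  # torch.Size([2, 3, 3])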

Nested tensors would allow you to create a tensor object containing tensors with different shapes, which doesn’t seem to be the use case you are working on.


I wish that torch.tensor(nested_list_of_tensors) gave you the corresponding multi-dimensional tensor, with the structure of the original nested list of tensors preserved. Anyway, I have a small code snippet that might be helpful here: Best way to convert a list to a tensor? - #8 by Brando_Miranda
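
For what it’s worth, here is a minimal workaround sketch for the equal-shape case (the shapes are made up for illustration): stacking level by level produces the multi-dimensional tensor that torch.tensor(nested_list_of_tensors) refuses to build.

import torch

# A 2-level nested list whose leaves are equal-shape tensors
nested = [[torch.rand(3), torch.rand(3)],
          [torch.rand(3), torch.rand(3)]]

# torch.tensor(nested) raises "only one element tensors can be converted to Python scalars",
# but stacking level by level works:
result = torch.stack([torch.stack(inner) for inner in nested])
print(result.shape)  # torch.Size([2, 2, 3])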