# Initializing a tensor from a list of tensors?

I am trying to make a tensor from a list of tensors, but it throws `ValueError: only one element tensors can be converted to Python scalars`.

My setup is as follows:

```python
import torch

# I have calculated three tensors n0, n1, n2
n0 = torch.tensor([-0.4360, -1.1935, -0.8568])
n1 = torch.tensor([ 0.8926, -0.0698,  0.5011])
n2 = torch.tensor([-1.3286, -1.1237, -1.3579])

b = torch.tensor([[n2 - n1],
                  [n0 - n2],
                  [n1 - n0]])
```

This produces the error. I am having trouble understanding:

• Why this way of making a tensor is undesirable
• How to do this in a more PyTorch-friendly way

Thanks for any help!

I'm not sure about the first question, but I would use the code below for the variable `b`.

```python
b = torch.cat(((n2 - n1).unsqueeze(0),
               (n0 - n2).unsqueeze(0),
               (n1 - n0).unsqueeze(0)), 0)
```
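Equivalently, `torch.stack` builds the same result more concisely: it inserts a new leading dimension and concatenates along it, so the explicit `unsqueeze(0)` calls are not needed. A minimal sketch, reusing the `n0`, `n1`, `n2` values from the question:

```python
import torch

n0 = torch.tensor([-0.4360, -1.1935, -0.8568])
n1 = torch.tensor([ 0.8926, -0.0698,  0.5011])
n2 = torch.tensor([-1.3286, -1.1237, -1.3579])

# torch.stack adds a new dimension (dim 0 by default) and stacks the
# tensors along it, matching unsqueeze(0) + torch.cat on each element.
b = torch.stack([n2 - n1, n0 - n2, n1 - n0])
print(b.shape)  # torch.Size([3, 3])
```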

For your first question, I think it's because it would be ambiguous whether the input tensors should be kept as an extra dimension or only their values copied out, which would be confusing.
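To make the ambiguity concrete: `torch.tensor()` expects (nested lists of) Python scalars, and a multi-element tensor inside the list cannot be reduced to a single scalar, which is exactly what the error message says. A short demonstration, reusing `n0` and `n1` from the question:

```python
import torch

n0 = torch.tensor([-0.4360, -1.1935, -0.8568])
n1 = torch.tensor([ 0.8926, -0.0698,  0.5011])

# A multi-element tensor inside the list triggers the ValueError,
# because torch.tensor() cannot convert it to a single Python scalar.
try:
    torch.tensor([[n1 - n0]])
except ValueError as e:
    print(e)

# A one-element tensor *can* be converted, which is the exception
# the error message alludes to:
x = torch.tensor([1.5])
print(float(x))  # 1.5
```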

Have you found the answer to your question? I can't wrap my head around how to get the desired result and want to know why the error comes up.