Convert a tuple into tensor

Hello, I have a NN that has a GRU and a ResNet. The problem is that the output of the GRU is a `<class 'tuple'>`, but the ResNet needs a tensor as input. How can I overcome this problem?


Use numpy.asarray(tuple) to convert it to a NumPy array, and then convert that into a tensor.


Sorry to open this again, but this might be useful to some people. Going to NumPy and then back to a tensor might not be a good idea, especially if the tensor is placed on the GPU. Assuming the tuple is called xx, here’s how I did it when I bumped into it today:

xx = torch.stack(list(xx), dim=0)

Writing this as a function:

def tuple_of_tensors_to_tensor(tuple_of_tensors):
    return torch.stack(list(tuple_of_tensors), dim=0)
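For example, stacking a tuple of two 3x4 tensors produces a single 2x3x4 tensor, with the new leading dimension indexing the tuple entries:

```python
import torch

def tuple_of_tensors_to_tensor(tuple_of_tensors):
    # stack the tuple's tensors along a new leading dimension
    return torch.stack(list(tuple_of_tensors), dim=0)

xx = (torch.randn(3, 4), torch.randn(3, 4))
out = tuple_of_tensors_to_tensor(xx)
print(out.shape)  # torch.Size([2, 3, 4])
```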

What if the tensors in the tuple have different dimensions, like 2x5 and 5x7?

I had the same issue after using a GRU. The tuple contains tensors, so you can also access a tensor simply by indexing the tuple.
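To illustrate: nn.GRU returns a tuple (output, h_n), and either unpacking or plain indexing gets you the tensor you want to feed into the next module. The sizes below are made-up example values:

```python
import torch
import torch.nn as nn

gru = nn.GRU(input_size=10, hidden_size=20, batch_first=True)
x = torch.randn(4, 7, 10)  # (batch, seq_len, features)

result = gru(x)            # a tuple: (output, h_n)
output, h_n = result       # unpack it...
same_output = result[0]    # ...or index it directly

print(type(result))        # <class 'tuple'>
print(output.shape)        # torch.Size([4, 7, 20])
```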

Just use torch.tensor(tuple) to convert tuple to PyTorch Tensor.

because that returns
ValueError: only one element tensors can be converted to Python scalars

FYI, this requires all tensors to be the same shape:
RuntimeError: stack expects each tensor to be equal size, but got [5, 4] at entry 0 and [2, 4] at entry 20

Mine weren’t, because the final batch is a remainder and so is smaller than the rest.

In that case, I would convert all except the last one (the remainder), then append the last element. It would be nice to tweak the above function to handle this case.

Doing torch.tensor(tuple) will break the flow of gradients, as you’re re-wrapping your tuple object. The best way to convert a tuple to a tensor is to use torch.stack or torch.cat. Although, as previously mentioned in this thread, with torch.stack all the tensors must be of equal size, so it’s something you’ll have to implement on a case-by-case basis.
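For the unequal-size question raised earlier, one case-by-case option (when the tensors differ only in their first dimension) is torch.nn.utils.rnn.pad_sequence, which zero-pads every tensor to the longest before stacking:

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# variable-length sequences that share their trailing dimension
seqs = [torch.randn(5, 4), torch.randn(2, 4), torch.randn(3, 4)]

# zero-pads each to length 5 and stacks into one tensor
padded = pad_sequence(seqs, batch_first=True)
print(padded.shape)  # torch.Size([3, 5, 4])
```

Tensors that differ in more than one dimension (like the 2x5 and 5x7 example above) still need a custom padding or reshaping scheme.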
