Hi. I’d like to concatenate two Variables, each of which is the output of an nn module.
Say I have Variables v1 and v2.
I can call
torch.cat([v1, v2]) in my Python interactive session, but when I put the same code in a script and run it, I get this error:
TypeError: cat received an invalid combination of arguments - got (tuple, int), but expected one of:
- (sequence[torch.cuda.FloatTensor] tensors)
- (sequence[torch.cuda.FloatTensor] tensors, int dim)
didn’t match because some of the arguments have invalid types: (tuple, int)
How should I concatenate two Variables?
(I’d like to concat them and feed the result to another fully connected layer)
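For reference, a minimal sketch of the setup being described (the module names and sizes are made up for illustration; in current PyTorch, Variable has been merged into Tensor, so autograd works on tensors directly):

```python
import torch
import torch.nn as nn

# Two illustrative modules whose outputs we want to concatenate
fc_a = nn.Linear(8, 4)
fc_b = nn.Linear(8, 6)

x = torch.randn(2, 8)    # batch of 2
v1 = fc_a(x)             # shape (2, 4)
v2 = fc_b(x)             # shape (2, 6)

# Concatenate along the feature dimension, then feed into another layer
combined = torch.cat([v1, v2], dim=1)   # shape (2, 10)
head = nn.Linear(10, 3)
out = head(combined)                     # shape (2, 3)
```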
I guess v1 and v2 have different types (i.e. torch.cuda.FloatTensor and torch.FloatTensor). That’s the problem: they need to have the same type.
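A quick sketch of checking for and fixing a type mismatch before the cat (here using a Float/Double mismatch as the example):

```python
import torch

v1 = torch.randn(2, 3)            # torch.FloatTensor
v2 = torch.randn(2, 3).double()   # torch.DoubleTensor

# torch.cat requires matching types; inspect them first
print(v1.type(), v2.type())

# Cast one operand to the other's type (and device) before concatenating
v2 = v2.type_as(v1)               # now both are FloatTensor
out = torch.cat([v1, v2], dim=0)  # shape (4, 3)
```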
Yeap. You were right.
Thanks for answering my dumb question
Helped me too.
It seems like the error message is misleading. My problem was that I tried to cat a FloatTensor and a LongTensor, but the error message talked about a tuple type.
I have got the same problem. However, printing
type(v1) and
type(v2) results in
<class 'torch.autograd.variable.Variable'> for both cases. Is it not possible to concatenate Variables?
Don’t use
type(v1); instead use
v1.type(). The former calls Python’s built-in function and only tells you the class, i.e. PyTorch’s
torch.autograd.variable.Variable as you posted. The answers here are referring to the data type of the
Variable, which is what the latter call returns. You can have the same data types for Variables as for Tensors in PyTorch, and if you try to concatenate (or apply most other operations to) two Variables of different types, you get the error above.
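The distinction can be seen directly (shown here on a modern tensor, since Variable has since been merged into Tensor; on old Variables the underlying data type lived under .data):

```python
import torch

v = torch.randn(2, 2)

# Python's built-in type() only reports the class
print(type(v))     # <class 'torch.Tensor'>

# Tensor.type() reports the actual data type
print(v.type())    # torch.FloatTensor
```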
Note that v1.type() will not work on Variables; one has to use v1.data.type() instead.
However, my problem was that one tensor was on the GPU whereas the other one was on the CPU.
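For the GPU/CPU case, a sketch of moving both operands onto the same device before concatenating (guarded so it also runs without a GPU):

```python
import torch

v1 = torch.randn(2, 3)   # on CPU
v2 = torch.randn(2, 3)

if torch.cuda.is_available():
    v2 = v2.cuda()       # v1 on CPU, v2 on GPU: torch.cat would now fail

# Move v2 onto v1's device before concatenating
v2 = v2.to(v1.device)
out = torch.cat([v1, v2], dim=0)   # shape (4, 3), on v1's device
```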
In my case, the types of the tensors being concatenated were both torch.DoubleTensor.
Converting them to FloatTensor using .float() fixed it.
Something about this error message feels off to me.
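The conversion mentioned here, sketched out (useful when a downstream layer expects FloatTensor inputs):

```python
import torch

v = torch.randn(2, 3).double()   # torch.DoubleTensor
print(v.type())                  # torch.DoubleTensor

v = v.float()                    # convert to torch.FloatTensor
print(v.type())                  # torch.FloatTensor
```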
I also encountered this problem, but both variables’ types are cuda.FloatTensor.
Does anyone know why? Thanks!