Concatenate torch tensors along a given dimension

In TensorFlow you can do something like this:

third_tensor = tf.concat(0, [first_tensor, second_tensor])

So if first_tensor and second_tensor are of size [5, 32, 32], where the first dimension is the batch size, then third_tensor would be of size [10, 32, 32], containing the two tensors stacked on top of each other.

How can I do this with torch variables? Or at least with torch tensors?


Check out torch.cat. It works on both torch tensors and variables.

For example:
third_tensor = torch.cat((first_tensor, second_tensor), 0)
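
A minimal, self-contained sketch of this (the tensor names and sizes just mirror the ones from the question):

import torch

first_tensor = torch.randn(5, 32, 32)   # batch of 5
second_tensor = torch.randn(5, 32, 32)  # batch of 5

# Concatenate along dimension 0 (the batch dimension)
third_tensor = torch.cat((first_tensor, second_tensor), 0)
print(third_tensor.size())  # torch.Size([10, 32, 32])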


Thanks, I wrote my own function that transforms the tensors to numpy arrays, uses numpy.concatenate, and then transforms the result back… not sure how slow this is, though.
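
(The poster's actual function isn't shown; a round trip like the one described might look roughly like this sketch, where cat_via_numpy is a made-up name:)

import numpy as np
import torch

def cat_via_numpy(a, b, dim=0):
    # Convert to numpy, concatenate, convert back.
    # Note: this detaches the result from the autograd graph.
    return torch.from_numpy(np.concatenate((a.numpy(), b.numpy()), axis=dim))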

Cool! But you won't be able to backpropagate through the numpy round trip (if you so desire), whereas torch.cat lets torch autograd do its "magic" 🙂 Not sure about the performance difference between the two, but I would guess the native way of concatenating is faster.
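
To illustrate that gradients flow through torch.cat (a small sketch using the current requires_grad API rather than the old Variable wrapper):

import torch

y = torch.randn(3, requires_grad=True)
z = torch.randn(3, requires_grad=True)

# Gradients propagate back through the concatenation
out = torch.cat((y, z), 0).sum()
out.backward()
print(y.grad)  # tensor([1., 1., 1.])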


Thanks, switched to torch.cat.

@Bixqu also be mindful of this pattern: x = torch.cat([y, z]).view(-1)
.view is a method that changes the shape of a tensor without changing its contents.
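
A short sketch of that pattern, with made-up tensor sizes for illustration:

import torch

y = torch.ones(2, 3)
z = torch.zeros(2, 3)

# cat along dim 0 gives a [4, 3] tensor; .view(-1) flattens it to [12]
x = torch.cat([y, z]).view(-1)
print(x.size())  # torch.Size([12])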