# Concatenate Tensors without Loop

I have a tensor `t` that I would like to transform, so I wrote a for loop that yields the desired result:

```python
t = torch.tensor([[ 1,  2,  3], [ 4,  5,  6], [ 7,  8,  9], [10, 11, 12]])
N, d = t.shape
new = torch.zeros((N, N, 2*d))
for i in range(N):
    for j in range(N):
        new[i, j, :d] = t[i]
        new[i, j, d:] = t[j]
new
```

```
tensor([[[ 1.,  2.,  3.,  1.,  2.,  3.],
         [ 1.,  2.,  3.,  4.,  5.,  6.],
         [ 1.,  2.,  3.,  7.,  8.,  9.],
         [ 1.,  2.,  3., 10., 11., 12.]],

        [[ 4.,  5.,  6.,  1.,  2.,  3.],
         [ 4.,  5.,  6.,  4.,  5.,  6.],
         [ 4.,  5.,  6.,  7.,  8.,  9.],
         [ 4.,  5.,  6., 10., 11., 12.]],

        [[ 7.,  8.,  9.,  1.,  2.,  3.],
         [ 7.,  8.,  9.,  4.,  5.,  6.],
         [ 7.,  8.,  9.,  7.,  8.,  9.],
         [ 7.,  8.,  9., 10., 11., 12.]],

        [[10., 11., 12.,  1.,  2.,  3.],
         [10., 11., 12.,  4.,  5.,  6.],
         [10., 11., 12.,  7.,  8.,  9.],
         [10., 11., 12., 10., 11., 12.]]])
```

Do the for loops cause any errors with the gradient calculations? Is there any other way that concatenates the tensors in the same way without any for loops?

You could repeat the tensor t and use torch.cat to create the new tensor:

```python
t_new = torch.cat((t.repeat(4, 1, 1).transpose(0, 1), t.repeat(4, 1, 1)), 2)
```
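As a sanity check (a sketch using the example tensor from the question, with `N` in place of the hard-coded `4` so it generalizes), the vectorized version matches the loop, and calling `backward()` through the loop version shows that autograd handles the in-place slice assignments, so the for loops don't break gradient calculation; they are just slower:

```python
import torch

t = torch.tensor([[ 1.,  2.,  3.], [ 4.,  5.,  6.],
                  [ 7.,  8.,  9.], [10., 11., 12.]], requires_grad=True)
N, d = t.shape

# Loop version, as in the question
new = torch.zeros((N, N, 2 * d))
for i in range(N):
    for j in range(N):
        new[i, j, :d] = t[i]
        new[i, j, d:] = t[j]

# Vectorized version; N replaces the hard-coded 4
t_new = torch.cat((t.repeat(N, 1, 1).transpose(0, 1), t.repeat(N, 1, 1)), 2)

print(torch.equal(new, t_new))  # True

# Gradients flow through the loop version despite the in-place writes:
# each element of t is copied 2*N times, so each gradient entry is 2*N.
new.sum().backward()
print(t.grad)
```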

Hi ptrblck,

I get my per-fold results, but I need to concatenate the predicted and target values across 10 folds. I tried `torch.stack`, but I cannot concatenate the tensors inside a loop. I need to concatenate them vertically in each iteration, so that at the end `ROCTotal` has shape `(xx, 2)`, where `ROCTotal[:, 0]` holds all predicted values and `ROCTotal[:, 1]` all target values. It works for two tensors, but not when concatenating in a loop. The predicted and target tensors have shape `(106, 1)`, and 106 can be different for each fold.

```python
ROCTotalFolds = torch.Tensor()

for ii in range(1, 11):
    # attempt 1: stack
    ROCTotalFolds = torch.stack([ROCTotalFolds, torch.from_numpy(AveragePredicted), torch.from_numpy(Target1)], dim=0)
    # attempt 2: cat
    ROCTotalFolds = torch.cat([ROCTotalFolds, torch.from_numpy(AveragePredicted), torch.from_numpy(Target1)], 1)
```

Could you post example code using random tensors with the right shapes, so that we can have a look at it? I'm currently unsure how all the tensors are defined.

Dear ptrblck,

I solved the problem.
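For reference, one common pattern for building the described `(xx, 2)` tensor across folds of varying size is to collect a per-fold `(n, 2)` pair in a Python list and call `torch.cat` once at the end. This is a sketch with random NumPy arrays standing in for `AveragePredicted` and `Target1` (their names and shapes are assumptions based on the post):

```python
import numpy as np
import torch

chunks = []
for fold in range(10):
    n = np.random.randint(50, 150)                       # fold size may vary
    avg_predicted = np.random.rand(n, 1)                 # stand-in for AveragePredicted
    target = np.random.randint(0, 2, (n, 1))             # stand-in for Target1

    # Build one (n, 2) block per fold: column 0 = predictions, column 1 = targets
    pair = torch.cat([torch.from_numpy(avg_predicted).float(),
                      torch.from_numpy(target).float()], dim=1)
    chunks.append(pair)

# Concatenate all folds vertically into (sum of fold sizes, 2)
ROCTotal = torch.cat(chunks, dim=0)
print(ROCTotal.shape)
```

Appending to a list and concatenating once avoids the empty-tensor dtype/shape mismatches that `torch.cat`/`torch.stack` raise when called inside the loop.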