# How to concatenate a matrix channel-wise using `view`?

It’s hard to describe, so let’s look at an example. I have a tensor like the one below:

```python
[[[1,1],[1,1]],
 [[2,2],[2,2]],
 [[3,3],[3,3]],
 [[4,4],[4,4]]]
```

Its size is `(4,2,2)`. I want to concatenate the different channels into a new tensor of size `(1,4,4)`. Using `view` directly gives the left array below, but I want the right format:

```python
[[1,1,1,1],             [[1,1,2,2],
 [2,2,2,2],              [1,1,2,2],
 [3,3,3,3],              [3,3,4,4],
 [4,4,4,4]]              [3,3,4,4]]
```

More generally, I would like a tensor of size `(N,C,H,W)` to be converted into `(N, C//4, H*2, W*2)` in the right-hand format above, using `view` instead of a loop.
What can I do? I really appreciate your help!

This permutation should work:

```python
x = torch.tensor([[[1,1],[1,1]],
                  [[2,2],[2,2]],
                  [[3,3],[3,3]],
                  [[4,4],[4,4]]])

print(x.view(2, 2, 2, 2).permute(0, 2, 1, 3).contiguous().view(1, 4, 4))
> tensor([[[1, 1, 2, 2],
           [1, 1, 2, 2],
           [3, 3, 4, 4],
           [3, 3, 4, 4]]])
```

Since your example has repeated values, make sure the values really end up in the positions you expect.
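One quick way to run that check (my own sketch, not from the thread) is to use `torch.arange` so every element is distinct, and verify where each one lands:

```python
import torch

# Distinct values 0..15 so every position is unambiguous
x = torch.arange(16).view(4, 2, 2)

# Same permutation as above: pair channels into 2x2 spatial blocks
out = x.view(2, 2, 2, 2).permute(0, 2, 1, 3).contiguous().view(1, 4, 4)
print(out)
# Channel 0 ([[0,1],[2,3]]) should land in the top-left 2x2 block,
# channel 1 top-right, channel 2 bottom-left, channel 3 bottom-right.
```

If the blocks come out in the wrong corners, the `permute` order is the first thing to revisit.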

I see what @ptrblck means. Here’s my new example:

```python
# first element of batch
[[[[1.1,1.2],[1.3,1.4]],
  [[2.1,2.2],[2.3,2.4]],
  [[3.1,3.2],[3.3,3.4]],
  [[4.1,4.2],[4.3,4.4]]],
 # second element of batch
 [[[5.1,5.2],[5.3,5.4]],
  [[6.1,6.2],[6.3,6.4]],
  [[7.1,7.2],[7.3,7.4]],
  [[8.1,8.2],[8.3,8.4]]]]
```

And if I use

```python
N, C, H, W = x.size()  # was `a.size()` — the tensor is named `x`
# note: C//2 == 2 here only because C == 4
x.view(N, C//2, C//2, H, W).permute(0, 1, 3, 2, 4).contiguous().view(N, C//4, H*2, W*2)
```

it will return

```python
tensor([[[[1.1000, 1.2000, 2.1000, 2.2000],
          [1.3000, 1.4000, 2.3000, 2.4000],
          [3.1000, 3.2000, 4.1000, 4.2000],
          [3.3000, 3.4000, 4.3000, 4.4000]]],

        [[[5.1000, 5.2000, 6.1000, 6.2000],
          [5.3000, 5.4000, 6.3000, 6.4000],
          [7.1000, 7.2000, 8.1000, 8.2000],
          [7.3000, 7.4000, 8.3000, 8.4000]]]])
```
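The same idea can be wrapped in a small helper that works for any `C` divisible by 4 (the function name `stack_channels_2x2` is my own, and this is just a sketch of the approach, not code from the thread):

```python
import torch

def stack_channels_2x2(x):
    """(N, C, H, W) -> (N, C//4, 2*H, 2*W): each group of four channels
    becomes a 2x2 grid of H x W tiles."""
    N, C, H, W = x.size()
    assert C % 4 == 0, "C must be divisible by 4"
    return (x.view(N, C // 4, 2, 2, H, W)   # split channels into 2x2 groups
             .permute(0, 1, 2, 4, 3, 5)     # bring tile rows next to image rows
             .contiguous()
             .view(N, C // 4, 2 * H, 2 * W))

# Distinct values make the block layout easy to verify by eye
x = torch.arange(16.).view(1, 4, 2, 2)
print(stack_channels_2x2(x))
```

For `C == 4` this reduces to the `view`/`permute`/`view` chain above; for larger `C` it produces one 2×H×2×W tile grid per group of four channels.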

Sorry to bother you, but there is another question.
Now I want to get a new format: the left is my input, the right is my expected output.

```python
# first element of batch
[[[[1.1,1.2],[1.3,1.4]],                       [[[1.1, 2.1, 3.1, 4.1],
  [[2.1,2.2],[2.3,2.4]],                         [1.2, 2.2, 3.2, 4.2],
  [[3.1,3.2],[3.3,3.4]],                         [1.3, 2.3, 3.3, 4.3],
  [[4.1,4.2],[4.3,4.4]]],                        [1.4, 2.4, 3.4, 4.4]],
 # second element of batch
 [[[5.1,5.2],[5.3,5.4]],                        [[5.1, 6.1, 7.1, 8.1],
  [[6.1,6.2],[6.3,6.4]],                         [5.2, 6.2, 7.2, 8.2],
  [[7.1,7.2],[7.3,7.4]],                         [5.3, 6.3, 7.3, 8.3],
  [[8.1,8.2],[8.3,8.4]]]]                        [5.4, 6.4, 7.4, 8.4]]]
```

`F.pixel_shuffle` is similar, but my input size is `(N, C*4, H, W)` and I want to convert it into `(N, C, H*2, W*2)`, instead of `(N, R**2, H, W)` into `(N, R, H*R, W*R)`.
Really, thank you for your help!

This should work:

```python
x = torch.tensor([[[[1.1,1.2],[1.3,1.4]],
                   [[2.1,2.2],[2.3,2.4]],
                   [[3.1,3.2],[3.3,3.4]],
                   [[4.1,4.2],[4.3,4.4]]],
                  [[[5.1,5.2],[5.3,5.4]],
                   [[6.1,6.2],[6.3,6.4]],
                   [[7.1,7.2],[7.3,7.4]],
                   [[8.1,8.2],[8.3,8.4]]]])

print(x.permute(0, 2, 3, 1).view(2, 4, 4))
> tensor([[[1.1000, 2.1000, 3.1000, 4.1000],
           [1.2000, 2.2000, 3.2000, 4.2000],
           [1.3000, 2.3000, 3.3000, 4.3000],
           [1.4000, 2.4000, 3.4000, 4.4000]],

          [[5.1000, 6.1000, 7.1000, 8.1000],
           [5.2000, 6.2000, 7.2000, 8.2000],
           [5.3000, 6.3000, 7.3000, 8.3000],
           [5.4000, 6.4000, 7.4000, 8.4000]]])
```

You are the best, @ptrblck!
But is a `contiguous()` needed after `permute`? And how did you work it out? Right now I treat the array as a one-dimensional list and then reshape it.
Anyway, it really helps! Thank you very much!

I add the `contiguous()` calls if the next `view` operation raises an error. Otherwise I assume PyTorch does the right thing and doesn’t need to create a contiguous copy of the tensor.
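A small toy example of that (my own illustration): `view` on a permuted tensor succeeds only when the requested shape is compatible with the tensor's strides; otherwise PyTorch raises an error and a contiguous copy (or `reshape`) is needed:

```python
import torch

x = torch.arange(24).view(2, 3, 4)
y = x.permute(1, 0, 2)            # shape (3, 2, 4), no longer contiguous

try:
    y.view(6, 4)                  # the strides of y can't express this shape
except RuntimeError:
    print("view failed, need a contiguous copy")

z = y.contiguous().view(6, 4)     # works after copying
# y.reshape(6, 4) is equivalent: it copies only when it has to
print(z.shape)
```

By contrast, `x.permute(0, 2, 3, 1).view(2, 4, 4)` in the answer above happens to be stride-compatible, so no copy was needed there.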

Do you mean, how do I come up with the solution?
If so, I don’t think there is a better way than having looked at various ways to reshape tensors in the past, and staring at the problem long enough until you realize which dimension has to move where.
At least I’m not using any formal method; I just trust my gut feeling.

I get it! Really, thanks!