Distributing one dimension over others

Hi guys,

I need to collapse one dimension of a tensor by distributing it over the following dimension. Can I do this without a for loop?

For example, A = [1, 2, 4, 4], and I need to make it [1, 5, 5].

Any thoughts?

A tensor A of shape 1 x 2 x 4 x 4 has 32 elements. You can't reshape it to 1 x 5 x 5, which would have 25 elements.

If you just want to squeeze dim 1 somehow, A = A.mean(dim=1) gives a 1 x 4 x 4 tensor.
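For illustration, a minimal sketch of that averaging (variable names are mine, not from the thread):

import torch

A = torch.rand(1, 2, 4, 4)   # 1 x 2 x 4 x 4, 32 elements
A_mean = A.mean(dim=1)       # average over dim 1 -> 1 x 4 x 4
print(A_mean.shape)          # torch.Size([1, 4, 4])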

Thanks, Alwyn, for your prompt reply.

Okay, we can say that I need it in the form A = [1, 8, 4].

This is what I had done with the loop:

x1 = []
for i in range(x.shape[1]):
    x_t = x[:, i, :, :]        # each slice is 1 x 4 x 4
    x1.append(x_t)

x_t = torch.cat(x1, dim=1)     # concatenate along dim 1 -> 1 x 8 x 4

You can use view like below:

A = torch.rand((1, 2, 4, 4)) # 1 x 2 x 4 x 4
batch, channel, height, width = A.shape
A = A.view(batch, channel*height, width) # 1 x 8 x 4
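One small caveat, added as a note: view requires a memory-compatible (e.g. contiguous) tensor, while reshape does the same thing and copies only when needed. A quick sketch (my own check, not from the original reply) that the loop and the view produce the same result:

import torch

x = torch.rand(1, 2, 4, 4)
loop_result = torch.cat([x[:, i, :, :] for i in range(x.shape[1])], dim=1)  # 1 x 8 x 4
view_result = x.view(1, 2 * 4, 4)                                           # 1 x 8 x 4
print(torch.equal(loop_result, view_result))        # True
print(torch.equal(x.reshape(1, 8, 4), view_result)) # reshape works too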