In that case you could index the `tail` and `head` tensors and use `torch.cat` to create a new tensor.
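A minimal sketch of that suggestion, assuming `head` and `tail` are feature tensors that differ only in their sample dimension (the names and shapes here are illustrative, not from the original code):

```python
import torch

# Hypothetical head/tail feature tensors; only dim 0 (sample count) differs.
head = torch.randn(7, 2, 128)
tail = torch.randn(3, 2, 128)

# Index a subset of each, then concatenate along dim 0 into one new tensor.
merged = torch.cat((head[:5], tail[:2]), dim=0)
print(merged.shape)  # torch.Size([7, 2, 128])
```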

Hi ptrblck,

I have extracted the features of 10 classes (7 head classes and 3 tail classes), then found their mean and variance individually (for the head and tail classes). After that, I merged the variances of all 10 classes to pass them to the loss function, which requires the dimensions in `(batchsize, 2, 128)` form. However, the merged variance ended up as a list, so it does not allow me to use the `view` method, i.e.

all_classes=all_classes.view(bsz, 2, -1)

AttributeError: ‘list’ object has no attribute ‘view’

Could you please help with that?

Cheers,

Angelina

`torch.cat` will return a tensor, so it seems you haven’t used it to create `all_classes`, have you?
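To illustrate the difference with toy per-class tensors (shapes here are illustrative): a plain Python list has no `view` method, but the tensor returned by `torch.cat` does.

```python
import torch

per_class = [torch.randn(3, 2, 128), torch.randn(5, 2, 128)]

# per_class.view(...) would raise AttributeError: a list has no .view.
# torch.cat returns a tensor, which does support .view:
all_classes = torch.cat(per_class, dim=0)   # shape [8, 2, 128]
bsz = all_classes.size(0)
reshaped = all_classes.view(bsz, 2, -1)
print(reshaped.shape)  # torch.Size([8, 2, 128])
```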

Could you post an executable code snippet showing your use case and explain what results are expected?

Thank you very much, ptrblck, for your guidance. Yes, it has been resolved using `torch.cat`.

Cheers

Hi ptrblck, I have the following dimensions for the average of the variance distributions of 7 head classes. According to the paper, the average variance of the head classes is to be transferred to each tail class. I have 3 tail classes. Could you please guide me on how to assign these variances to each tail class?

Shape of average variance distribution :

Mean_of_head_class_Variance_Mean 0 Shape: torch.Size([17, 2, 128])

Mean_of_head_class_Variance_Mean 1 Shape: torch.Size([13, 2, 128])

Mean_of_head_class_Variance_Mean 2 Shape: torch.Size([16, 2, 128])

Mean_of_head_class_Variance_Mean 3 Shape: torch.Size([9, 2, 128])

Mean_of_head_class_Variance_Mean 4 Shape: torch.Size([7, 2, 128])

Mean_of_head_class_Variance_Mean 5 Shape: torch.Size([13, 2, 128])

Mean_of_head_class_Variance_Mean 6 Shape: torch.Size([10, 2, 128])

I tried different ways, including the following, but it didn’t work:

```
for tidx, hidx in zip(tail_idx, head_idx):
    tail_angles_inv_Cos[tidx] = torch.stack([Mean_of_head_class_var_Mean[hidx]], dim=0)
```
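One way to sketch the assignment described above, assuming each per-head-class tensor first has to be reduced over its (differently sized) sample dimension before the head classes can be averaged; all names and the tail indices here are illustrative:

```python
import torch

# Hypothetical per-head-class variance tensors with different sample counts.
head_vars = [torch.randn(17, 2, 128), torch.randn(13, 2, 128), torch.randn(9, 2, 128)]

# Reduce each class over its sample dim so every entry is [2, 128], then average.
head_avg = torch.stack([v.mean(dim=0) for v in head_vars], dim=0).mean(dim=0)

# Assign the same head-class average to every tail class.
tail_vars = {tidx: head_avg.clone() for tidx in [7, 8, 9]}
print(tail_vars[7].shape)  # torch.Size([2, 128])
```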

Hi ptrblck,

I have 10 classes (indices or labels 0…9) with different shapes, i.e.

class 1 to class 10 shapes: [14, 2, 128], [16, 2, 128], [15, 2, 128], [20, 2, 128], [15, 2, 128], [20, 2, 128], [10, 2, 128], [11, 2, 128], [7, 2, 128]

I want to create a label tensor of batch size 128 using the first value from each shape, so that the label tensor consists of 14 values of index 0, 16 values of index 1, 15 values of index 2, and so on.

Any help in this regard will be highly appreciated.

Cheers,

Angelina
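A common way to build such a label tensor, assuming the per-class sample counts are known (the counts below are the ones listed in the question), is `torch.repeat_interleave`:

```python
import torch

# Hypothetical per-class sample counts; their sum is the batch size (128).
counts = torch.tensor([14, 16, 15, 20, 15, 20, 10, 11, 7])

# Repeat each class index by its count: 14 zeros, 16 ones, 15 twos, ...
labels = torch.repeat_interleave(torch.arange(len(counts)), counts)
print(labels.shape)  # torch.Size([128])
print(labels[:3])    # tensor([0, 0, 0])
```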

I’m not sure which dimension you would like to index, but assuming you have the 10 class tensors as:

```
class0 = torch.randn(14, 2, 128)
class1 = torch.randn(16, 2, 128)
```

and want to get the 14 values from `class0` at index 0 in `dim1` and `dim2`, you could use:

```
result = torch.cat((class0[:, 0, 0], class1[:, 0, 0], ...))
```
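For reference, a runnable version of that snippet with just the two classes written out (the `...` in the original stands for the remaining class tensors):

```python
import torch

class0 = torch.randn(14, 2, 128)
class1 = torch.randn(16, 2, 128)

# class0[:, 0, 0] keeps all 14 rows and selects index 0 in dim1 and dim2.
result = torch.cat((class0[:, 0, 0], class1[:, 0, 0]))
print(result.shape)  # torch.Size([30])
```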

Thanks a lot, ptrblck.

Cheers

Hi, ptrblck,

I am passing the variance of the 10 classes to the loss function and it generates the following error, even though the dimension of each tensor is correct. It works fine when features of the same dimensions are passed to the loss function.

Error is:

```
all_classes = torch.cat([head_class_angle_mean_var[0], head_class_angle_mean_var[1],
                         head_class_angle_mean_var[2], head_class_angle_mean_var[3],
                         head_class_angle_mean_var[4], head_class_angle_mean_var[5],
                         head_class_angle_mean_var[6], tail_class_angle_mean_var[7],
                         tail_class_angle_mean_var[8], tail_class_angle_mean_var[9]])
```

**RuntimeError: invalid argument 0: Tensors must have same number of dimensions: got 3 and 1 at /opt/conda/conda-bld/pytorch_1565272271120/work/aten/src/THC/generic/THCTensorMath.cu:62**

Cheers

Could you print the shapes of all inputs to `torch.cat`

and post these shapes here?

Shapes of all features (all_classes) and labels are:

head_class_angle_mean_var[0] shape : torch.Size([34, 2, 128])

head_class_angle_mean_var[1] shape : torch.Size([20, 2, 128])

head_class_angle_mean_var[2] shape : torch.Size([20, 2, 128])

head_class_angle_mean_var[3] shape : torch.Size([13, 2, 128])

head_class_angle_mean_var[4] shape : torch.Size([12, 2, 128])

head_class_angle_mean_var[5] shape : torch.Size([12, 2, 128])

head_class_angle_mean_var[6] shape : torch.Size([7, 2, 128])

head_class_angle_mean_var[7] shape : torch.Size([4, 2, 128])

head_class_angle_mean_var[8] shape : torch.Size([3, 2, 128])

head_class_angle_mean_var[9] shape : torch.Size([3, 2, 128])

**all_classes shape torch.Size([128, 2, 128])**

**Length of Labels : 128**

I have also used ceil_mode=True, which doesn’t work either.

The provided shapes work fine, so I guess your code might be different from the one posted here.

```
head_class_angle_mean_var = []
head_class_angle_mean_var.append(torch.randn([34, 2, 128]))
head_class_angle_mean_var.append(torch.randn([20, 2, 128]))
head_class_angle_mean_var.append(torch.randn([20, 2, 128]))
head_class_angle_mean_var.append(torch.randn([13, 2, 128]))
head_class_angle_mean_var.append(torch.randn([12, 2, 128]))
head_class_angle_mean_var.append(torch.randn([12, 2, 128]))
head_class_angle_mean_var.append(torch.randn([7, 2, 128]))
head_class_angle_mean_var.append(torch.randn([4, 2, 128]))
head_class_angle_mean_var.append(torch.randn([3, 2, 128]))
head_class_angle_mean_var.append(torch.randn([3, 2, 128]))
for idx, h in enumerate(head_class_angle_mean_var):
    print(idx, h.shape)
> 0 torch.Size([34, 2, 128])
1 torch.Size([20, 2, 128])
2 torch.Size([20, 2, 128])
3 torch.Size([13, 2, 128])
4 torch.Size([12, 2, 128])
5 torch.Size([12, 2, 128])
6 torch.Size([7, 2, 128])
7 torch.Size([4, 2, 128])
8 torch.Size([3, 2, 128])
9 torch.Size([3, 2, 128])
res = torch.cat(head_class_angle_mean_var)
print(res.shape)
> torch.Size([128, 2, 128])
```

Hi ptrblck,

Thanks for your reply; my code to concatenate the different variances is given below:

```
all_classes = torch.cat([head_class_angle_mean_var[0], head_class_angle_mean_var[1],
                         head_class_angle_mean_var[2], head_class_angle_mean_var[3],
                         head_class_angle_mean_var[4], head_class_angle_mean_var[5],
                         head_class_angle_mean_var[6], tail_class_angle_mean_var[7],
                         tail_class_angle_mean_var[8], tail_class_angle_mean_var[9]])
```

I have also used your code (given below). In both approaches the final size of the variance is `[128, 2, 128]`, and the generated error is also the same (given at the bottom). Could you please help with that?

```
head_class_angle_mean_var2 = []
head_class_angle_mean_var2.append(head_class_angle_mean_var[0])
head_class_angle_mean_var2.append(head_class_angle_mean_var[1])
head_class_angle_mean_var2.append(head_class_angle_mean_var[2])
head_class_angle_mean_var2.append(head_class_angle_mean_var[3])
head_class_angle_mean_var2.append(head_class_angle_mean_var[4])
head_class_angle_mean_var2.append(head_class_angle_mean_var[5])
head_class_angle_mean_var2.append(head_class_angle_mean_var[6])
head_class_angle_mean_var2.append(tail_class_angle_mean_var[7])
head_class_angle_mean_var2.append(tail_class_angle_mean_var[8])
head_class_angle_mean_var2.append(tail_class_angle_mean_var[9])
all_classes2 = torch.cat(head_class_angle_mean_var2)
```

**Error is:**

```
all_classes2 = torch.cat(head_class_angle_mean_var2)
```

**RuntimeError: invalid argument 0: Tensors must have same number of dimensions: got 3 and 1 at /opt/conda/conda-bld/pytorch_1565272271120/work/aten/src/THC/generic/THCTensorMath.cu:62**

Cheers
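The error message says the runtime inputs mix 3-D and 1-D tensors, so one of the entries actually passed to `torch.cat` is not `[N, 2, 128]`. A quick way to locate it (a debugging sketch, not the original code) is to print each tensor's number of dimensions just before the `torch.cat` call:

```python
import torch

def check_cat_inputs(tensors):
    """Print dim and shape of each input and flag any that are not 3-D."""
    for idx, t in enumerate(tensors):
        flag = "" if t.dim() == 3 else "  <-- not 3-D, would break torch.cat"
        print(idx, t.dim(), tuple(t.shape), flag)

# Illustrative: one accidentally 1-D entry reproduces the reported RuntimeError.
tensors = [torch.randn(3, 2, 128), torch.randn(128)]
check_cat_inputs(tensors)
```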