Hi,
I have 2 categories of classes, Category 1 and Category 2. Category 1 contains 6 classes and Category 2 contains 4 classes. I have extracted the features of both categories with dimensions of [3, 128]; the features of the Category 1 classes are stored at indices 0, 1, …, 5, while the features of the Category 2 classes are stored at indices 6, 7, 8, 9. Could you please guide me on how to combine the features of all classes into a single variable?
It should look like:
all_features = f1, f2, …, f10
Cheers
I’m not sure how the mentioned shape of [3, 128] fits the indices in [0, 10] from the description.
However, if you want to concatenate tensors, you could use torch.cat, or torch.stack if you want to add a new dimension.
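To make the difference concrete, here is a minimal sketch using two hypothetical feature tensors of shape [3, 128] (the tensor names are placeholders, not from the original post):

```python
import torch

# Two hypothetical feature tensors of shape [3, 128]
f1 = torch.randn(3, 128)
f2 = torch.randn(3, 128)

# torch.cat joins tensors along an existing dimension
cat_dim0 = torch.cat((f1, f2), dim=0)   # shape [6, 128]

# torch.stack adds a new leading dimension
stacked = torch.stack((f1, f2), dim=0)  # shape [2, 3, 128]

print(cat_dim0.shape, stacked.shape)
```

Note that both functions take a single sequence (tuple or list) of tensors as their first argument.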
Hi ptrblck,
Using torch.stack as:
all_classes=torch.stack( head_features_var[0], head_features_var[1], head_features_var[2], head_features_var[3], head_features_var[4], head_features_var[5], head_features_var[6], tail_features_mean_var[7],tail_features_mean_var[8],tail_features_mean_var[9])
generates the following error:
TypeError: stack() takes from 1 to 2 positional arguments but 10 were given
Try to wrap it in a tuple or list as:
all_classes=torch.stack((head_features_var[0], head_features_var[1], head_features_var[2], head_features_var[3], head_features_var[4], head_features_var[5], head_features_var[6], tail_features_mean_var[7],tail_features_mean_var[8],tail_features_mean_var[9]))
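For reference, here is a minimal runnable sketch of the same fix, using placeholder tensors in place of the original head/tail feature variables (which are not shown in the thread):

```python
import torch

# Hypothetical stand-ins for the 10 per-class feature tensors, each [3, 128]
features = [torch.randn(3, 128) for _ in range(10)]

# stack() expects a single sequence (tuple or list) of tensors,
# not the tensors as separate positional arguments
all_classes = torch.stack(features)  # shape [10, 3, 128]
print(all_classes.shape)
```

Passing the tensors as one list avoids the TypeError, since stack() treats its first positional argument as the sequence to stack.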