Depthwise convolution: apply two different weight tensors to two different inputs with one conv call


Before explaining the problem, I would like to clarify that I am simplifying it to make the question easier to follow. What I am trying to do: I have 2 tensors of shape 1x3x5 (batch, features, length) as input data. I need to apply a different filter of shape (out_channels, in_channels, length), in our case (12, 3, 5), to each tensor. I would like to stress that this filter is different for each sample. To apply one of these filters to one of the tensors I am using the torch.nn.functional.conv1d function, as you can see below:

import torch

data = torch.randn(1, 3, 5)
weight = torch.randn(12, 3, 5)
out = torch.nn.functional.conv1d(data, weight)
print(data.shape)    # torch.Size([1, 3, 5])
print(weight.shape)  # torch.Size([12, 3, 5])
print(out.shape)     # torch.Size([1, 12, 1])

The most straightforward way to apply this function to each tensor, taking into account that the weight is different for each one, is a for loop, but that is inefficient. I was thinking of using depthwise convolution via the groups parameter, but I am not sure whether I have understood depthwise convolution properly. Below you can see a snippet.
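For reference, this is a minimal sketch of the for-loop baseline I want to avoid (the list names are illustrative):

```python
import torch
import torch.nn.functional as F

datas = [torch.randn(1, 3, 5) for _ in range(2)]     # one input tensor per sample
weights = [torch.randn(12, 3, 5) for _ in range(2)]  # a different filter per sample

# naive approach: one conv1d call per (sample, filter) pair
outs = [F.conv1d(d, w) for d, w in zip(datas, weights)]
print([o.shape for o in outs])  # [torch.Size([1, 12, 1]), torch.Size([1, 12, 1])]
```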

data_1 = torch.randn(1, 3, 5)
data_2 = torch.randn(1, 3, 5)
data = torch.cat((data_1, data_2), dim=1)
print(data.shape)    # torch.Size([1, 6, 5])
weight_1 = torch.randn(12, 3, 5)
weight_2 = torch.randn(12, 3, 5)
weight = torch.cat((weight_1, weight_2), dim=1)
print(weight.shape)  # torch.Size([12, 6, 5])
out = torch.nn.functional.conv1d(data, weight, groups=2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: Given groups=2, weight of size [12, 6, 5], expected input[1, 6, 5] to have 12 channels, but got 6 channels instead

Basically, I concatenate the input tensors along the feature dimension and the weight tensors along the input-channel dimension. I expected that using groups=2 would give the same behavior as applying the convolutions individually, but it is not working. :frowning:
Can you tell me what I'm missing? Is there any way to implement my use case with the current convolution implementation?

Thanks for your help,

Solved. The weights need to be concatenated on dim 0 (the output-channel dimension), not dim 1.
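A minimal sketch of the fix, verifying that the single grouped call reproduces the two individual convolutions (variable names are illustrative):

```python
import torch
import torch.nn.functional as F

data_1 = torch.randn(1, 3, 5)
data_2 = torch.randn(1, 3, 5)
weight_1 = torch.randn(12, 3, 5)
weight_2 = torch.randn(12, 3, 5)

# inputs are still concatenated along the channel dimension: (1, 6, 5)
data = torch.cat((data_1, data_2), dim=1)
# weights are concatenated along dim 0, the output-channel dimension: (24, 3, 5)
weight = torch.cat((weight_1, weight_2), dim=0)

# with groups=2, the first 12 filters see only the first 3 input channels,
# and the last 12 filters see only the last 3 input channels
out = F.conv1d(data, weight, groups=2)
print(out.shape)  # torch.Size([1, 24, 1])

# the two halves of the output match the individual convolutions
assert torch.allclose(out[:, :12], F.conv1d(data_1, weight_1), atol=1e-5)
assert torch.allclose(out[:, 12:], F.conv1d(data_2, weight_2), atol=1e-5)
```

Note that with groups=2 the weight's second dimension must equal in_channels / groups (here 6 / 2 = 3), which is why the dim=1 concatenation of shape (12, 6, 5) was rejected.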