Batch-wise F.conv2d Convolution

Hi, I am trying to implement a batch-wise convolution using F.conv2d. Below is my current implementation, which uses a for loop. I was wondering whether there is any way to avoid the for loop here?

import torch
import torch.nn.functional as F

B = torch.randn(50, 26)
c = torch.randn(8, 26, 128, 128)
h = torch.randn(8, 1, 50, 1)
outputs = []

for bs in range(c.size(0)):
    # compute Bc
    Bc = torch.matmul(B.unsqueeze(0), torch.reshape(c[bs, :, :, :], (c.size(1), c.size(2) * c.size(3))))
    # compute hBc
    hBc = F.conv2d(Bc.unsqueeze(0), h[bs, :, :, :].unsqueeze(0).flip(2), padding=nPadding).squeeze(0)
    # reshape hBc
    output = torch.reshape(hBc, (1, h.size(2), c.size(2), c.size(3)))
    outputs.append(output)

Any help would be appreciated. Thanks!

Your padding is undefined, and removing it raises a shape mismatch.
Could you also explain what B and c are? It seems Bc is the actual input.
Is the calculation of Bc relevant for the "batch-wise" conv, or just a preprocessing step?

I defined the padding for the convolution as follows:

nPadding = h.size(2)-1

You are right that Bc is the actual input. However, the matrix multiplication between B and c is performed batch-wise, i.e. each batch element of c is multiplied by the same matrix B. The product Bc is then convolved with h batch-wise. Hope this is clear.
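If it helps, that batch-wise product needs no loop either, since the same B multiplies every batch element of c. A minimal sketch (tensor shapes as in the snippet above); the einsum and the broadcasting-matmul forms are equivalent:

```python
import torch

B = torch.randn(50, 26)           # shared matrix, same for every batch element
c = torch.randn(8, 26, 128, 128)  # batch of inputs

# Batch-wise product Bc via einsum: contract the size-26 dimension.
Bc = torch.einsum('fk,bkhw->bfhw', B, c)  # (8, 50, 128, 128)

# Equivalent via broadcasting matmul over the flattened spatial dims.
Bc2 = torch.matmul(B, c.reshape(8, 26, -1)).reshape(8, 50, 128, 128)
```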

With this nPadding value I still see:

RuntimeError: shape '[1, 50, 128, 128]' is invalid for input of size 1631718

I missed one step after the convolution. Here is the updated code, which should work:

import torch.nn.functional as F
import torch

nFrame = 50
B = torch.randn(50, 26)
c = torch.randn(8, 26, 128, 128)
h = torch.randn(8, 1, 50, 1)
nPadding = h.size(2) - 1
outputs = []

for bs in range(c.size(0)):
    # compute Bc
    Bc = torch.matmul(B.unsqueeze(0), torch.reshape(c[bs, :, :, :], (c.size(1), c.size(2) * c.size(3))))
    # compute hBc
    hBc = F.conv2d(Bc.unsqueeze(0), h[bs, :, :, :].unsqueeze(0).flip(2), padding=nPadding).squeeze(0)
    # crop the fake indices introduced by the padding
    hBc = hBc[:, :nFrame, nPadding:(hBc.size(2) - nPadding)]
    # reshape hBc
    output = torch.reshape(hBc, (1, h.size(2), c.size(2), c.size(3)))
    outputs.append(output)
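The loop itself can be removed with a grouped convolution: fold the batch into the channel dimension and set groups = batch size, so F.conv2d applies the bs-th kernel only to the bs-th "channel". A sketch under the same shapes; note it pads only the height with padding=(nPadding, 0), which matches the original code that padded the width and then cropped it away:

```python
import torch
import torch.nn.functional as F

nFrame = 50
B = torch.randn(50, 26)
c = torch.randn(8, 26, 128, 128)
h = torch.randn(8, 1, 50, 1)
nPadding = h.size(2) - 1

# Batch-wise matrix product via broadcasting: (8, 26, H*W) -> (8, 50, H*W)
Bc = torch.matmul(B, c.reshape(c.size(0), c.size(1), -1))

# Fold the batch into the channel dimension; with groups = batch size,
# each "channel" is convolved with its own kernel.
x = Bc.unsqueeze(0)          # (1, 8, 50, H*W)
w = h.flip(2)                # (8, 1, 50, 1): one kernel per group
out = F.conv2d(x, w, padding=(nPadding, 0), groups=c.size(0))
out = out[:, :, :nFrame, :]  # crop the fake rows introduced by the padding
outputs = out.reshape(c.size(0), nFrame, c.size(2), c.size(3))
```

Here `outputs` is a single (8, 50, 128, 128) tensor rather than a list; it should equal `torch.cat(outputs)` from the loop version.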