Using nn.Conv1d when the batch size varies in the loader

I am using an input of shape batchsize x nFeatures.
The data loader is as follows:

train_loader = DataLoader(peakspectra_tic, batch_size=bSize, shuffle=True, num_workers=1, drop_last=False)
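For context, whether a smaller last batch appears depends only on whether the dataset length divides evenly by bSize. A quick arithmetic sketch with made-up numbers (1000 samples and bSize = 64 are assumptions, not taken from the post):

```python
# Made-up sizes for illustration: 1000 samples, batch size 64.
n_samples, bSize = 1000, 64

last_batch = n_samples % bSize         # size of the leftover final batch (0 if it divides evenly)
batches_keep = -(-n_samples // bSize)  # drop_last=False: ceil division, smaller batch included
batches_drop = n_samples // bSize      # drop_last=True: floor division, smaller batch dropped

print(last_batch, batches_keep, batches_drop)  # 40 16 15
```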

And the conv layer is:

    def convlayer_(self, inchannels, outchannels, outputsize):
        layer = nn.Sequential(
            nn.Conv1d(in_channels=inchannels, out_channels=outchannels, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm1d(num_features=outputsize),
            nn.ReLU()
        )
        return layer
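For reference, nn.Conv1d expects a 3-D input of shape [batch, in_channels, length], and BatchNorm1d's num_features must match the conv's out_channels. A minimal sketch (the channel counts, lengths, and batch sizes below are made up, not taken from the model above) showing that a layer built this way accepts any batch size:

```python
import torch
import torch.nn as nn

# Conv1d takes [batch, in_channels, length]; a 2-D input of shape
# [batchsize, nFeatures] can be lifted to one channel via unsqueeze(1).
layer = nn.Sequential(
    nn.Conv1d(in_channels=1, out_channels=8, kernel_size=3, stride=1, padding=1),
    nn.BatchNorm1d(num_features=8),  # num_features must equal out_channels
    nn.ReLU(),
)

x_full = torch.randn(32, 100).unsqueeze(1)  # full batch of 32
x_last = torch.randn(7, 100).unsqueeze(1)   # smaller final batch of 7
print(layer(x_full).shape)  # torch.Size([32, 8, 100])
print(layer(x_last).shape)  # torch.Size([7, 8, 100])
```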

When a batch contains exactly bSize samples it runs, but the last batch (kept because drop_last=False) is smaller and raises an error.
How can I deal with this?

All layers accept a variable batch size, so I assume the error in the last batch means you are using the actual batch size as the number of channels?
If so, set drop_last=True in your DataLoader to drop the last (smaller) batch, since your model depends on a fixed batch size and will fail otherwise. Note that this approach is also uncommon, as your inference use case would then depend on the same fixed batch size.
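As a quick check of the drop_last=True behavior (the toy dataset and sizes below are made up stand-ins for peakspectra_tic and bSize):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 10 samples of 5 features, batch size 4 (made-up numbers).
data = TensorDataset(torch.randn(10, 5))
loader = DataLoader(data, batch_size=4, shuffle=False, drop_last=True)

# With drop_last=True the leftover batch of 2 is discarded,
# so every batch the model sees has exactly 4 samples.
sizes = [batch[0].shape[0] for batch in loader]
print(sizes)  # [4, 4]
```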