How can I ensure that my Conv1d retains the same shape with unknown sequence lengths?

I get the equation from the Conv1d docs:

L_out = floor((L_in + 2 * padding - dilation * (kernel_size - 1) - 1) / stride + 1)

but it depends on Lin and Lout. I want them to be the same, but when defining my network, I don’t know what my input size will be. In my __init__, I have:

        l_in = 128
        # Solve L_out == L_in for padding (dilation = 1):
        # padding = ((l_in - 1) * stride + 1 - l_in + kernel - 1) / 2
        padding1 = l_in - 1
        padding1 *= strides[0]
        padding1 += 1
        padding1 -= l_in
        padding1 += kernel_sizes[0] - 1
        padding1 /= 2
        print('padding1', padding1)
        padding1 = int(padding1)
        self.conv1 = torch.nn.Conv1d(
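The arithmetic above can be sanity-checked without building the layer. Here is a minimal sketch (the helper names `conv1d_out_len` and `same_padding` are mine, not from the code above; `conv1d_out_len` is the L_out formula from the docs):

```python
def conv1d_out_len(l_in, kernel, stride=1, padding=0, dilation=1):
    # L_out formula from the Conv1d documentation
    return (l_in + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

def same_padding(l_in, kernel, stride=1):
    # Solve L_out == L_in for padding, as in the __init__ above (dilation = 1)
    return ((l_in - 1) * stride + 1 - l_in + kernel - 1) // 2

l_in, kernel, stride = 128, 3, 1
p = same_padding(l_in, kernel, stride)
assert conv1d_out_len(l_in, kernel, stride, padding=p) == l_in
```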

This works well if I know what l_in is. But what if I don’t? For future sequences, I won’t have the length when defining the model.

Is it possible for me to define my padding in a way that’s independent from sequence length?


The usual trick here is that if your stride is 1 and your dilation is 1 and your kernel has an odd size, you can set the padding to floor(kernel_size / 2).
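As a quick check (a minimal sketch; the channel sizes here are arbitrary), with stride 1 and an odd kernel, `padding=kernel_size // 2` preserves the length for any input size:

```python
import torch

# Arbitrary channel sizes, just for illustration
conv = torch.nn.Conv1d(in_channels=4, out_channels=4,
                       kernel_size=5, padding=5 // 2)  # padding = 2

for l_in in (10, 37, 128):           # length only known at run time
    x = torch.randn(1, 4, l_in)      # (batch, channels, length)
    assert conv(x).shape[-1] == l_in
```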
If you use a stride that is not 1, you won’t be able to remove l_in from the formula above I’m afraid.

So then I have to use a stride of 1 if I have unknown lengths?

Well, if you have a stride that is not 1, it is very strange to ask for the output to be the same size, no? Shouldn't you rather ask for the output to be input/stride?

Maybe I can pad the inputs to fit something before passing to my model?

For example: if you use a stride of 2 and a kernel of 3 for an input of size 10, then according to your formula above the padding should be 5. That means your output will contain mostly 0s, as many applications of your kernel end up entirely inside the padding.
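Plugging the numbers in (a quick numeric check using the L_out formula from the docs; the helper name is mine):

```python
def conv1d_out_len(l_in, kernel, stride=1, padding=0, dilation=1):
    # L_out formula from the Conv1d documentation
    return (l_in + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

# stride 2, kernel 3, input 10: the "same" padding formula gives
# ((10 - 1) * 2 + 1 - 10 + 3 - 1) / 2 = 11 / 2, i.e. 5 after truncation
padding = ((10 - 1) * 2 + 1 - 10 + 3 - 1) // 2
print(padding)                                      # 5
print(conv1d_out_len(10, 3, stride=2, padding=5))   # 9, not even exactly 10
```

With padding 5 the first kernel applications cover only padding zeros, and because 11/2 is not an integer the output length does not even match exactly.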

Stride of 1 it is :slight_smile:

So for stride of 1, you have:

padding = dilation * (kernel -1) / 2

Assuming a dilation of 1, since you don't use it in your code, that gives (kernel - 1)/2.
Since PyTorch only supports padding as an integer number (what would non-integer padding even mean?), you need kernel to be an odd number, and then your padding is (kernel - 1)/2 :slight_smile:
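For what it's worth, newer PyTorch releases (1.9 and later, if I remember right) also accept `padding='same'` directly on Conv1d, which does this computation for you (stride must be 1). A small sketch, with arbitrary channel sizes:

```python
import torch

# padding='same' handles the dilation * (kernel - 1) // 2 arithmetic for you
conv = torch.nn.Conv1d(in_channels=2, out_channels=2,
                       kernel_size=7, dilation=2, padding='same')

x = torch.randn(1, 2, 33)        # any length works
assert conv(x).shape[-1] == 33   # length preserved
```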