but it depends on `L_in` and `L_out`. I want them to be the same, but when defining my network, I don’t know what my input size will be. In my `__init__`, I have:
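(the original snippet isn’t shown; here is a hypothetical sketch of the kind of layer in question, with placeholder channel counts and kernel size)

```python
import torch.nn as nn

class MyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Placeholder values -- the question is what `padding` should be
        # so that L_out == L_in, given that L_in is unknown here.
        self.conv = nn.Conv1d(in_channels=16, out_channels=32,
                              kernel_size=5, stride=1,
                              padding=0)  # <-- what goes here?
```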
The usual trick here is that if your stride is 1 and your dilation is 1 and your kernel has an odd size, you can set the padding to `floor(kernel_size / 2)`.
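For example, a minimal check (the channel counts and kernel size here are arbitrary):

```python
import torch
import torch.nn as nn

kernel_size = 5  # any odd kernel size works for this trick
conv = nn.Conv1d(in_channels=16, out_channels=16,
                 kernel_size=kernel_size,
                 stride=1, dilation=1,
                 padding=kernel_size // 2)  # floor(kernel_size / 2)

x = torch.randn(8, 16, 100)  # (batch, channels, L_in)
print(conv(x).shape)         # torch.Size([8, 16, 100]) -> L_out == L_in
```

Note that this works for any `L_in`, so you don’t need to know the input size in `__init__`.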
If you use a stride that is not 1, you won’t be able to remove `L_in` from the formula above, I’m afraid.
Well, if you have a stride that is not 1, it is very strange to ask for the output to be the same size, no? Shouldn’t you instead ask for the output to be `input / stride`?
For example: if you use a stride of 2 and a kernel of 3 for an input of size 10, you would get, according to your formula above, that the padding should be 5. That means your output will contain mostly 0s, as many applications of your kernel land entirely in the padding.
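If `output = input / stride` is what you want, the same `kernel_size // 2` padding gives you exactly that (more precisely, `L_out == ceil(L_in / stride)`). A quick sketch with the numbers above:

```python
import torch
import torch.nn as nn

# stride 2, kernel 3, padding = 3 // 2 = 1
conv = nn.Conv1d(4, 4, kernel_size=3, stride=2, padding=1)
x = torch.randn(1, 4, 10)
print(conv(x).shape)  # torch.Size([1, 4, 5]) -> i.e. 10 / 2
```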
Assuming a dilation of 1 (since you don’t use it in your code), you have `(kernel_size - 1) / 2`.
Since PyTorch only supports padding as an integer (what would non-integer padding even mean?), you need `kernel_size` to be an odd number, which gives you a padding of `(kernel_size - 1) / 2`.
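As a sanity check: with stride 1 and dilation 1, the formula from the docs reduces to `L_out = L_in + 2*padding - (kernel_size - 1)`, and substituting `padding = (kernel_size - 1) / 2` gives `L_out = L_in`, independent of the input size.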