The simplest solution is to create a wrapper class around nn.Conv2d that, instead of passing a padding argument to the nn.Conv2d object, applies explicit padding with torch.nn.functional.pad.
The following post explains how to use explicit padding, and wrapping it in a class that contains an nn.Conv2d is fairly easy.
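As a rough sketch of the idea (class and parameter names are my own; `F.pad` takes padding as a `(left, right, top, bottom)` tuple for 4D input, which also lets you do asymmetric padding that nn.Conv2d's own `padding` argument cannot express):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Conv2dExplicitPad(nn.Module):
    """Conv2d wrapper that pads the input explicitly before convolving."""
    def __init__(self, in_channels, out_channels, kernel_size,
                 stride=1, padding=(0, 0, 0, 0), pad_mode='constant',
                 pad_value=0.0, **kwargs):
        super().__init__()
        # padding is (left, right, top, bottom), the order F.pad expects
        self.padding = padding
        self.pad_mode = pad_mode
        self.pad_value = pad_value
        # the inner conv itself does no padding
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=0, **kwargs)

    def forward(self, x):
        x = F.pad(x, self.padding, mode=self.pad_mode, value=self.pad_value)
        return self.conv(x)

# Example: asymmetric padding (1 left, 2 right, 1 top, 2 bottom)
conv = Conv2dExplicitPad(3, 8, kernel_size=3, padding=(1, 2, 1, 2))
out = conv(torch.randn(1, 3, 32, 32))
# 32 + 1 + 2 = 35 padded, 3x3 kernel, stride 1 -> 33x33 output
print(out.shape)  # torch.Size([1, 8, 33, 33])
```

`pad_mode` can be any mode `F.pad` supports (`'constant'`, `'reflect'`, `'replicate'`, `'circular'`), so the same wrapper also covers padding modes nn.Conv2d does not offer directly.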