Convolution forward Error about dimension

Hi there, I have the following ModuleList, named path1, in my network definition:

ModuleList (
(0): Conv2d(3, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(1): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True)
(2): LeakyReLU (0.1, inplace)
(3): MaxPool2d (size=(2, 2), stride=(2, 2), dilation=(1, 1))
(4): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(5): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True)
(6): LeakyReLU (0.1, inplace)
(7): MaxPool2d (size=(2, 2), stride=(2, 2), dilation=(1, 1))
(8): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(9): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True)
(10): LeakyReLU (0.1, inplace)
(11): Conv2d(128, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(12): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True)
(13): LeakyReLU (0.1, inplace)
(14): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(15): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True)
(16): LeakyReLU (0.1, inplace)
(17): MaxPool2d (size=(2, 2), stride=(2, 2), dilation=(1, 1))
(18): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(19): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True)
(20): LeakyReLU (0.1, inplace)
(21): Conv2d(256, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(22): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True)
(23): LeakyReLU (0.1, inplace)
(24): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(25): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True)
(26): LeakyReLU (0.1, inplace)
(27): MaxPool2d (size=(2, 2), stride=(2, 2), dilation=(1, 1))
(28): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(29): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(30): LeakyReLU (0.1, inplace)
(31): Conv2d(512, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(32): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True)
(33): LeakyReLU (0.1, inplace)
(34): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(35): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(36): LeakyReLU (0.1, inplace)
(37): Conv2d(512, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(38): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True)
(39): LeakyReLU (0.1, inplace)
(40): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(41): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True)
(42): LeakyReLU (0.1, inplace)
)

and here is my input:

inputs = autograd.Variable(torch.randn(1,3,416,416))

In my forward function, I run the following loop:

def forward(self, input):
    out = input
    for layer in self.path1:
        out = layer(out)
    return out
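
For debugging, here is a minimal sketch of the same loop with a shape print added (assuming the same self.path1 list); it shows exactly which module raises the error and what shape the tensor has just before entering it:

def forward(self, input):
    out = input
    for i, layer in enumerate(self.path1):
        # print the index, the layer, and the shape of the tensor about to enter it
        print(i, layer, out.size())
        out = layer(out)
    return out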

At the first iteration of the loop, i.e. for the first convolution layer, I receive the error below:

RuntimeError: Need input of dimension 4 and input.size[1] == 32 but got input to be of shape: [1 x 3 x 416 x 416] at /py/conda-bld/pytorch_1493677666423/work/torch/lib/THNN/generic/SpatialConvolutionMM.c:47

Could you please tell me how I can solve this problem? I think something went wrong with PyTorch.
By the way, I work with both Keras (with the TensorFlow backend) and PyTorch, and both were installed through Anaconda. Could using them together be the cause of this error?

There is a clear error message there that should help you:

RuntimeError: Need input of dimension 4 and input.size[1] == 32 but got input to be of shape: [1 x 3 x 416 x 416]
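
The message says the layer that is actually being applied expects a 4-dimensional input whose channel dimension (input.size[1]) is 32, while your input has 3 channels. A minimal check, assuming the model instance is named model and the list is reachable as model.path1 (names chosen here only for illustration):

first = model.path1[0]
print(first)                  # the printout above says Conv2d(3, 32, ...)
print(first.weight.size())    # weight shape is (out_channels, in_channels, kH, kW)
print(inputs.size())          # (1, 3, 416, 416)
# If first.weight.size(1) is 32 rather than 3, the module being called first is
# not the Conv2d(3, 32, ...) shown in the printout, which would be consistent
# with what the error message reports.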