Autograd error when running BatchNorm2d in PyTorch

With Learnable Parameters

m = nn.BatchNorm2d(100)

Without Learnable Parameters

m = nn.BatchNorm2d(100, affine=False)

The error occurs with the following input:

input = autograd.Variable(torch.randn(20, 100, 35, 45))
output = m(input)

Why does this raise an error when autograd is involved? It seems to work fine when I run it without autograd.
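For comparison, here is a minimal sketch that runs the same forward pass and a backward pass on recent PyTorch versions. Note that `autograd.Variable` was merged into `Tensor` in PyTorch 0.4, so the snippet below uses a plain tensor with `requires_grad=True` instead of the `Variable` wrapper (the exact error in the original post isn't shown, so this is only a working baseline, not a diagnosis):

```python
import torch
import torch.nn as nn

# BatchNorm2d with learnable parameters (affine=True is the default)
m = nn.BatchNorm2d(100)

# Modern PyTorch: a plain tensor with requires_grad=True replaces
# the deprecated autograd.Variable wrapper
input = torch.randn(20, 100, 35, 45, requires_grad=True)
output = m(input)

# Reduce to a scalar and backpropagate to exercise autograd
output.sum().backward()

print(output.shape)      # same shape as the input
print(input.grad.shape)  # gradient has the input's shape
```

If this runs cleanly on your setup, the original error likely comes from the `Variable` usage (e.g. an older PyTorch version or a missing import) rather than from BatchNorm2d itself.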