I tried to normalize the input during the forward pass of the model like this:
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        # ImageNet channel statistics, shaped for broadcasting over NCHW input
        mean = torch.as_tensor([0.485, 0.456, 0.406])[None, :, None, None]
        std = torch.as_tensor([0.229, 0.224, 0.225])[None, :, None, None]
        self.register_buffer('mean', mean)
        self.register_buffer('std', std)
        ...

    def forward(self, inputs):
        # Input size [batch, channel, height, width]
        # Normalize inside the model
        inputs = inputs.sub(self.mean).div(self.std)
        ...
        return output
During training everything is fine, but when I switch to eval() mode the model starts to give random outputs. Disabling eval() brings back meaningful outputs during validation, but I need eval() mode since I use dropout and batchnorm in the model. Any idea what causes this behavior?
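For what it's worth, the buffer-based normalization itself should not depend on the mode: register_buffer only stores a non-trainable tensor, and train()/eval() do not change it. A minimal sketch (a hypothetical stripped-down module, not your full model) isolating just the normalization confirms this, which suggests the discrepancy comes from the dropout/batchnorm layers instead:

```python
import torch
import torch.nn as nn

class NormOnly(nn.Module):
    """Hypothetical minimal module: only the normalization buffers,
    no dropout or batchnorm layers."""
    def __init__(self):
        super().__init__()
        mean = torch.as_tensor([0.485, 0.456, 0.406])[None, :, None, None]
        std = torch.as_tensor([0.229, 0.224, 0.225])[None, :, None, None]
        self.register_buffer('mean', mean)
        self.register_buffer('std', std)

    def forward(self, x):
        return x.sub(self.mean).div(self.std)

model = NormOnly()
x = torch.rand(2, 3, 8, 8)  # NCHW dummy input

model.train()
out_train = model(x)
model.eval()
out_eval = model(x)

# Normalization via buffers gives identical results in both modes
print(torch.allclose(out_train, out_eval))  # True
```

Since this part is mode-independent, the usual suspects for train/eval divergence are batchnorm running statistics (e.g. if they were never updated correctly during training) rather than the normalization shown here.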