The model still updates the weights of frozen layers during training

Hi, I’m facing a strange problem. After unfreezing a small portion of the model, training starts to update the whole model, even though the changes are small. When I take the output of the layer just before model.inception, that output changes as well. Theoretically this should not happen. Can you suggest what might be causing this?

My code is here:
import torch

model = oldmodel()
model.load_state_dict(torch.load(MODEL_PATH))

# Freeze all parameters, then unfreeze only the inception block
for param in model.parameters():
    param.requires_grad = False
for param in model.inception.parameters():
    param.requires_grad = True

Could you post a minimal and executable code snippet reproducing the issue?
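
In the meantime, one common cause worth ruling out (an assumption, since your training loop isn’t shown): if the optimizer was created over all of model.parameters() and the freezing happened after some training steps, optimizers with momentum or weight decay (e.g., Adam) can keep updating frozen parameters from their stale .grad attributes and internal state. A minimal sketch of restricting the optimizer to the trainable parameters only (the SGD choice and hyperparameters are placeholders):

import torch

# Build the optimizer over trainable parameters only, so frozen
# parameters cannot be moved by momentum or weight decay updates.
trainable_params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable_params, lr=1e-3, momentum=0.9)

# Also clear any stale gradients so frozen parameters are skipped.
optimizer.zero_grad(set_to_none=True)

If the layer before model.inception still changes with a setup like this, a runnable repro would help narrow it down further.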