About freezing parameters?

I wrote some code to freeze part of my model:

for param in model.network.reasoner.parameters():
    param.requires_grad = False

Then, for the optimizer, I wrote:

parameters = [p for p in self.network.parameters() if p.requires_grad]
self.optimizer = optim.SGD(parameters, lr=self.args.learning_rate, momentum=self.args.momentum, weight_decay=self.args.weight_decay)

However, when I do

loss.backward()

I get the following error:

RuntimeError: inconsistent range for TensorList output

When I do not freeze the reasoner, the whole model trains fine. Any idea why this error happens?
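In case it helps, here is a stripped-down sketch of the pattern I'm using. Network, encoder, reasoner, and head are placeholders, not my actual architecture, and the hyperparameters are made up:

import torch
import torch.nn as nn
import torch.optim as optim

# Toy model: "reasoner" stands in for the submodule I freeze
class Network(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(10, 10)
        self.reasoner = nn.Linear(10, 10)
        self.head = nn.Linear(10, 1)

    def forward(self, x):
        return self.head(self.reasoner(self.encoder(x)))

network = Network()

# Freeze the reasoner submodule
for param in network.reasoner.parameters():
    param.requires_grad = False

# Pass only the trainable parameters to the optimizer
parameters = [p for p in network.parameters() if p.requires_grad]
optimizer = optim.SGD(parameters, lr=0.01, momentum=0.9, weight_decay=1e-4)

x = torch.randn(4, 10)
target = torch.randn(4, 1)
loss = nn.functional.mse_loss(network(x), target)
loss.backward()  # this is the call that raises the RuntimeError in my real code
optimizer.step()

My actual model follows this same pattern; only the architecture and the training hyperparameters differ.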