I want a BatchNorm layer with no bias (no learnable shift), so I wrote it like this:
self.bn = BatchNorm1d(2048)
self.bn.bias.requires_grad_(False) # no shift
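For context, the layer sits inside a larger module; the surrounding class and layer names in this sketch are just illustrative, but the BatchNorm part is exactly what I have:

import torch
import torch.nn as nn

class Net(nn.Module):  # illustrative name, my real model is larger
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 2048)
        self.bn = nn.BatchNorm1d(2048)      # affine=True by default, so weight and bias both exist
        self.bn.bias.requires_grad_(False)  # freeze the shift so it never updates

    def forward(self, x):
        return self.bn(self.fc(x))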
What I am doing is meta-learning, so I need to compute gradients explicitly, and I write:
grad_info = torch.autograd.grad(loss_meta_train, self.model.module.params(), create_graph=True)
so that I can later backpropagate through these gradients, but I get this error:
RuntimeError: One of the differentiated Tensors does not require grad
The offending tensor is the BatchNorm bias. How can I solve this?
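For reference, here is a minimal sketch that reproduces the same error without my meta-learning code (in my setup self.model.module.params() returns all parameters, so passing model.parameters() below plays the same role):

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 2048), nn.BatchNorm1d(2048))
model[1].bias.requires_grad_(False)      # the frozen BatchNorm bias, as above

x = torch.randn(8, 512)
loss = model(x).pow(2).mean()            # stand-in for loss_meta_train

# The frozen bias is among the inputs, so this raises:
# RuntimeError: One of the differentiated Tensors does not require grad
grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)

So the question is how to keep the bias frozen while still calling torch.autograd.grad over the model's parameters with create_graph=True.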