RuntimeError: One of the differentiated Variables does not require grad

I want to estimate the Fisher information matrix, and in the network I have set requires_grad = False on some parameters because I do not want them to be updated:

This is what I did:

loglikelihood_grads = autograd.grad(loglikelihood, self.parameters())

parameter_names = [
    n.replace('.', '__') for n, p in self.named_parameters()
]

return {n: g**2 for n, g in zip(parameter_names, loglikelihood_grads)}

I got the following error: RuntimeError: One of the differentiated Variables does not require grad.
The error goes away if I replace self.parameters() with filter(lambda p: p.requires_grad, self.parameters()) in the autograd.grad call.
But then how can I find which parameter names the entries of loglikelihood_grads correspond to?
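For reference, a minimal runnable sketch of what I am trying to do (the model, sizes, and loss here are made up for illustration):

```python
import torch
from torch import nn, autograd

# Toy model; freeze the first layer so some parameters have requires_grad=False.
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))
for p in model[0].parameters():
    p.requires_grad = False

x = torch.randn(16, 4)
target = torch.randint(0, 2, (16,))
loglikelihood = -nn.functional.cross_entropy(model(x), target)

# Passing all parameters to autograd.grad raises the RuntimeError because
# some are frozen; filtering keeps only the trainable ones.
trainable = [p for p in model.parameters() if p.requires_grad]
grads = autograd.grad(loglikelihood, trainable)

# Apply the same filter to named_parameters() to get names in matching order.
names = [
    n.replace('.', '__')
    for n, p in model.named_parameters() if p.requires_grad
]
fisher_diag = {n: g ** 2 for n, g in zip(names, grads)}
```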

You can use self.named_parameters() to iterate over (n, p) pairs.

Best regards

Thomas

Hi,
Yes, I used self.named_parameters() to iterate over the names and parameters.
But how do I know which name and parameter each entry of loglikelihood_grads corresponds to, when the grads are calculated by
loglikelihood_grads = autograd.grad(loglikelihood, filter(lambda p: p.requires_grad, self.parameters()))

Does something like names = [n for n,p in self.named_parameters() if p.requires_grad] not work? You could also zip that with loglikelihood_grads if you want pairs…
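A quick way to convince yourself that the ordering matches (the toy module here is made up; any nn.Module works the same way):

```python
import torch
from torch import nn

# Hypothetical toy module with one frozen parameter.
model = nn.Linear(3, 2)
model.bias.requires_grad = False

# parameters() yields tensors in the same order as named_parameters(),
# so applying the same requires_grad filter to both keeps them aligned.
names_and_params = [(n, p) for n, p in model.named_parameters() if p.requires_grad]
params = [p for p in model.parameters() if p.requires_grad]

for (name, p1), p2 in zip(names_and_params, params):
    assert p1 is p2  # each filtered name pairs with the identical tensor
```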

Best regards

Thomas

Thank you, I will test this. 🙂