Estimate the gradient of a model output w.r.t. its parameters

Hi,
I have a model dnn_Linear. I would like to take the derivative of the output (v2) of this model (dnn_Linear) w.r.t. its parameters.

ss = torch.autograd.grad(v2,dnn_Linear.parameters(), grad_outputs=None, retain_graph=None, create_graph=False, only_inputs=True, allow_unused=False)

I get the following error: grad can be implicitly created only for scalar outputs.

Also, it would be helpful to know how to estimate the gradient of a model's output w.r.t. its parameters.

Best
Sal

You would have to reduce the output to a scalar or pass a gradient in the same shape as the output to solve the issue, since autograd.grad computes a vector-Jacobian product and needs that vector explicitly for non-scalar outputs:

# setup
import torch
import torch.nn as nn

model = nn.Linear(10, 10)
x = torch.randn(1, 10)

out = model(x)

# fails
torch.autograd.grad(out, model.parameters())
# RuntimeError: grad can be implicitly created only for scalar outputs

# works: reduce the output to a scalar first
torch.autograd.grad(out.mean(), model.parameters())

# also works: keep the non-scalar output and pass grad_outputs explicitly
# (recreate out, since the previous call freed the graph);
# grad_outputs=torch.ones_like(out) is equivalent to differentiating out.sum()
out = model(x)
torch.autograd.grad(out, model.parameters(), grad_outputs=torch.ones_like(out))
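If you need the gradient of every output element separately (i.e. the full Jacobian of the output w.r.t. the parameters) rather than one reduced gradient, a minimal sketch is to call autograd.grad once per output element with a one-hot grad_outputs, keeping the graph alive between calls. The model and x names follow the setup above; the loop is illustrative, not efficient for large outputs:

# full Jacobian sketch: one backward pass per output element
out = model(x)
num_params = sum(p.numel() for p in model.parameters())

rows = []
for i in range(out.numel()):
    # one-hot vector selecting the i-th output element
    onehot = torch.zeros_like(out)
    onehot.view(-1)[i] = 1.0
    grads = torch.autograd.grad(
        out, model.parameters(), grad_outputs=onehot, retain_graph=True)
    # flatten the per-parameter gradients into one Jacobian row
    rows.append(torch.cat([g.reshape(-1) for g in grads]))

jacobian = torch.stack(rows)  # shape: [out.numel(), num_params]

Newer PyTorch versions also offer torch.autograd.functional.jacobian and the torch.func API, which can compute the same thing in a single call, but the loop above makes the grad_outputs mechanics explicit.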

Thank you. It does not show an error with the proposed solution.
Sal