# Compute the gradient of the loss function

How can I compute the gradient of the loss with respect to the parameters of the network?

Hi Periklis!

Autograd to the RESCUE!

Specifically, compute a loss that depends on your network, call
loss.backward(), and autograd will do the rest. You can then examine the
.grad properties of your network's parameters. (This requires that your
network parameters have requires_grad = True and that you compute
your loss using (usefully differentiable) PyTorch tensor operations.)

Consider:

>>> import torch
>>> torch.__version__
'1.10.2'
>>> _ = torch.manual_seed (2022)
>>> network = torch.nn.Linear (2, 3, bias = False)
>>> network.weight
Parameter containing:
tensor([[-0.1474,  0.5967],
        [ 0.3660, -0.1681],
        ...], requires_grad=True)
>>> input = torch.arange (10.).reshape (5, 2)
>>> input
tensor([[0., 1.],
        [2., 3.],
        [4., 5.],
        [6., 7.],
        [8., 9.]])
>>> loss = (network (input)).sum()
>>> loss.backward()
>>> network.weight.grad
tensor([[20., 25.],
        [20., 25.],
        [20., 25.]])
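The gradient above can be checked by hand: with loss = network(input).sum() and no bias, d(loss)/d(W[j]) is just the column-sum of the input, the same for every output row j, and independent of the weight values. A minimal sketch of that check:

```python
import torch

# Same setup as the interactive session above.
network = torch.nn.Linear(2, 3, bias=False)
inp = torch.arange(10.).reshape(5, 2)

loss = network(inp).sum()
loss.backward()

# Each of the 3 output rows sees d(loss)/d(W[j]) = inp.sum(dim=0) = [20., 25.],
# regardless of what the randomly initialized weights happen to be.
expected = inp.sum(dim=0).expand(3, 2)
print(torch.equal(network.weight.grad, expected))  # True
```

This is why the printed gradient has identical rows [20., 25.]: the input columns sum to 0+2+4+6+8 = 20 and 1+3+5+7+9 = 25.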


Best.

K. Frank


Thank you so much for your answer; however, my model has no .weight attribute.
On the other hand, I tried to do something like

for param in model.parameters():
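Even when a model has no top-level .weight, every learnable tensor is still reachable through .parameters() (or, more readably, .named_parameters()). Here is a minimal sketch of that loop, using a small Sequential model as a hypothetical stand-in for your actual model:

```python
import torch

# Hypothetical stand-in model; substitute your own.
model = torch.nn.Sequential(
    torch.nn.Linear(2, 4),
    torch.nn.ReLU(),
    torch.nn.Linear(4, 1),
)

inp = torch.randn(8, 2)
loss = model(inp).sum()
loss.backward()

# After backward(), every parameter carries its gradient in .grad,
# even though `model` itself has no .weight attribute.
for name, param in model.named_parameters():
    print(name, tuple(param.shape), param.grad is not None)
```

The names printed (e.g. "0.weight", "2.bias") show how the weights live inside the submodules rather than on the model itself.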