Backpropagate the loss of a certain layer

Hello all
How can I backpropagate the loss over a specific layer of the network (not all the layers, and not the layers before that specific layer)? I want to update the weights of a certain layer only.
Thanks

You could pass the desired parameters as the inputs argument to the .backward() call, which will then accumulate gradients only in these parameters and ignore the rest.

Could you share some example code with me?

This should work:

import torch
import torchvision.models as models

model = models.resnet18()

x = torch.randn(2, 3, 224, 224)
out = model(x)
loss = out.mean()

# Accumulate gradients only in conv1's parameters; every other
# parameter's .grad will stay None.
loss.backward(inputs=list(model.conv1.parameters()))

# Verify: only the conv1 parameters have a non-None .grad.
for name, param in model.named_parameters():
    print(name, param.grad)
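
If you then want to actually update only that layer, a minimal follow-up sketch (assuming a plain SGD optimizer, which is an addition and not part of the original example) is to build the optimizer over the same parameter subset, so that step() can only touch those weights:

import torch
import torchvision.models as models

model = models.resnet18()

# Optimizer over the target layer only, so step() cannot modify other weights.
optimizer = torch.optim.SGD(model.conv1.parameters(), lr=0.01)

x = torch.randn(2, 3, 224, 224)
loss = model(x).mean()

optimizer.zero_grad()
# Restrict gradient accumulation to conv1, as above.
loss.backward(inputs=list(model.conv1.parameters()))
optimizer.step()  # updates conv1 only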