Update weights of a particular layer instead of backpropagating through the whole model

I am using a custom loss function to calculate a loss for each layer, and I need to update the weights of that layer only. Is this possible?

Could you explain your approach a bit more?
Generally, if you pass only the parameters of that specific layer to the optimizer, only those parameters will be updated.
Would that work for you?

Consider that I have 3 layers and I am calculating the loss caused by each layer (3 losses) separately with a custom loss function, and I need to update the weights of each layer.

Are you able to call .backward() on your custom loss?
If so, my approach should work.

Thank you first of all. Can you explain how to optimize only a single layer? Normally we pass model.parameters() to the optimizer, so what should be passed for a single layer? And yes, .backward() is working.

For a single layer, you could pass only that layer's parameters to the optimizer:

optimizer = torch.optim.SGD(model.layer1.parameters(), lr=1e-3)
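\
For instance, building on the snippet above, a typical update step could look like this (the input x and the custom_loss call are just placeholders for your own data and loss); only layer1's weights will change, since the optimizer holds no other parameters:

x = torch.randn(4, 10)         # placeholder input batch
optimizer.zero_grad()          # clear old gradients on layer1
loss = custom_loss(model(x))   # placeholder for your custom loss
loss.backward()                # gradients flow through the whole graph...
optimizer.step()               # ...but only layer1's parameters are updated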

If you need to pass some combination of different parameter sets, you can just pass them as a list:

params = list(model.layer1.parameters()) + list(model.layer17.parameters())
optimizer = torch.optim.SGD(params, lr=1e-3)
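\
To tie this back to your 3-layer / 3-loss setup, here is a minimal sketch of one way to wire it up; the model, layer sizes, and per-layer loss below are made up for illustration. Each layer gets its own optimizer holding only that layer's parameters, and the input to each layer is detached so that each per-layer loss only produces gradients for its own layer:

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(10, 20)
        self.layer2 = nn.Linear(20, 20)
        self.layer3 = nn.Linear(20, 5)

    def forward(self, x):
        # Detach between layers so each per-layer loss only reaches its own layer.
        a1 = self.layer1(x)
        a2 = self.layer2(a1.detach())
        a3 = self.layer3(a2.detach())
        return a1, a2, a3

model = Net()

# One optimizer per layer, each holding only that layer's parameters.
optimizers = [
    torch.optim.SGD(model.layer1.parameters(), lr=1e-3),
    torch.optim.SGD(model.layer2.parameters(), lr=1e-3),
    torch.optim.SGD(model.layer3.parameters(), lr=1e-3),
]

def custom_layer_loss(activation):
    # Placeholder for your custom per-layer loss.
    return activation.pow(2).mean()

x = torch.randn(8, 10)
activations = model(x)

for act, opt in zip(activations, optimizers):
    opt.zero_grad()
    loss = custom_layer_loss(act)
    loss.backward()   # gradients only reach this layer because of the detach()
    opt.step()        # and only this layer's weights are updated

The detach() calls are a design choice: they keep each loss's graph local to its layer, so no retain_graph bookkeeping is needed and each layer is trained purely on its own loss. Whether that matches your setup depends on how your custom loss is defined; either way, each optimizer can only ever update the parameters it was given.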

Really, thanks, it's working. Thank you very much for your help.