Can I update only certain parameters in a convolution or Linear layer during backward(), while keeping the other parameters in the same layer unchanged?

For example:
a 3*3 conv's parameters (third layer):
I want to update entries 1, 2, and 9 using the gradients from backward(), and keep entries 3, 4, 5, 6, 7, and 8 unchanged.
I also don't want the gradients of entries 3–8 to take part in computing the gradients of the second layer's parameters.

My English is poor, and I am new to deep learning.
Thanks for your advice.

Set a custom backward hook on the parameter you want, and zero out the gradient entries that should stay frozen.
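A minimal sketch of that idea, assuming the frozen/updated entries are chosen with a binary mask (the specific mask positions below are just an illustration of "entries 1, 2, and 9"):

```python
import torch
import torch.nn as nn

# A 3x3 conv layer whose kernel we want to partially freeze.
conv = nn.Conv2d(1, 1, kernel_size=3, bias=False)

# Mask: 1 where the gradient is kept, 0 where the entry is frozen.
mask = torch.zeros_like(conv.weight)
mask[0, 0, 0, 0] = 1.0  # "entry 1"
mask[0, 0, 0, 1] = 1.0  # "entry 2"
mask[0, 0, 2, 2] = 1.0  # "entry 9"

# register_hook is called with the gradient of this tensor during
# backward(); returning a new tensor replaces that gradient.
conv.weight.register_hook(lambda grad: grad * mask)

x = torch.ones(1, 1, 5, 5)
conv(x).sum().backward()
print(conv.weight.grad)  # zero everywhere except the three unmasked entries
```

One caveat on your second point: masking the weight gradient does not change the gradient flowing back to the second layer, because that path goes through the weight *values*, not the weight gradients. To alter what propagates upstream you would need a hook on the layer's input gradient instead.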

Thank you!
I still have a question:
Should I consider using register_backward_hook instead? Is there any difference between them?

They share the same idea, but register_hook is defined on a Variable (a Tensor in current PyTorch), while register_backward_hook is defined on an nn.Module, including containers like nn.Sequential. Note that in recent PyTorch versions register_backward_hook is deprecated in favor of register_full_backward_hook.
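A short sketch of the two levels side by side, assuming a recent PyTorch where register_full_backward_hook is available (the halving factor is just an arbitrary example of editing a gradient):

```python
import torch
import torch.nn as nn

lin = nn.Linear(4, 2, bias=False)

# Tensor-level hook: receives the gradient of this one tensor and
# may return a replacement (here, the gradient scaled by 0.5).
lin.weight.register_hook(lambda grad: grad * 0.5)

# Module-level hook: receives the gradients w.r.t. the module's
# inputs and outputs (it cannot directly edit the weight gradient).
def module_hook(module, grad_input, grad_output):
    print("grad_output shape:", grad_output[0].shape)

lin.register_full_backward_hook(module_hook)

x = torch.randn(3, 4, requires_grad=True)
lin(x).sum().backward()
```

So for freezing individual weight entries, the tensor-level register_hook on the parameter is the natural fit; the module-level hook is for inspecting or modifying the gradients flowing into and out of the whole layer.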