For example:
the parameters of a 3×3 convolution kernel (the third layer):
[[1,2,3],[4,5,6],[7,8,9]]
I want to update 1, 2, and 9 using the gradients from backward(), and keep 3, 4, 5, 6, 7, and 8 unchanged.
I also don't want the gradients of 3, 4, 5, 6, 7, and 8 to take part in computing the gradients of the second layer's parameters.
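A minimal sketch of one common way to do the first part, assuming PyTorch (since backward() is mentioned): register a hook on the weight tensor that multiplies the incoming gradient by a 0/1 mask, so the masked positions receive zero gradient and the optimizer never changes them. Note that this hook only affects the update of this layer's weights; the gradients flowing back to the second layer depend on the weight *values* used in the forward pass, not on this layer's weight gradients. The layer shapes here are made up for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical 3x3 conv standing in for the "third layer".
conv = nn.Conv2d(1, 1, kernel_size=3, bias=False)

# 1 where the weight may be updated (positions 1, 2, 9),
# 0 where it must stay fixed (positions 3-8).
mask = torch.tensor([[1., 1., 0.],
                     [0., 0., 0.],
                     [0., 0., 1.]]).view(1, 1, 3, 3)

# Zero out the frozen positions' gradients before any optimizer step.
conv.weight.register_hook(lambda grad: grad * mask)

x = torch.randn(1, 1, 5, 5)
conv(x).sum().backward()
```

After backward(), `conv.weight.grad` is zero at every masked position, so an optimizer step leaves those six weights untouched.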
My English is poor, and I am new to deep learning.
Thanks in advance for your advice.