How to keep the weights of a conv layer unchanged?

Hello, everyone.
I want to sum up the elements within every filter window, so I'm wondering whether I can define a conv layer, initialize its weights to 1, and keep them unchanged.
I also need the gradient to backprop from top to bottom through this layer. Is setting the learning rate to zero the right solution?
Thanks!

I'm wondering whether I can define a conv layer, initialize its weights to 1, and keep them unchanged.

Yes.

m = nn.Conv2d(...)
m.weight.requires_grad = False  # exclude the weight from gradient updates
with torch.no_grad():
    m.weight.fill_(1.0)         # every filter element is 1, so the conv sums its window
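Here is a minimal runnable sketch of the idea (kernel size, channel counts, and input shape are assumptions for illustration). With `requires_grad = False` the weight receives no gradient and the optimizer never changes it, so there is no need to set the learning rate to zero, yet the backward pass still propagates gradients through the layer to its input:

```python
import torch
import torch.nn as nn

# Fixed "window-sum" conv: every output element is the sum of a 3x3 window.
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3, bias=False)
with torch.no_grad():
    conv.weight.fill_(1.0)          # all-ones filter -> plain summation
conv.weight.requires_grad_(False)   # freeze the weight

x = torch.ones(1, 1, 5, 5, requires_grad=True)
y = conv(x)                         # each output element is 9.0 (3x3 ones)
y.sum().backward()                  # gradient still flows back to x

print(conv.weight.grad)             # None: the frozen weight gets no gradient
print(x.grad is not None)           # True: the input still receives gradients
```

If you pass the model's parameters to an optimizer, frozen parameters are simply never updated; alternatively, filter them out with `filter(lambda p: p.requires_grad, model.parameters())`.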