How to implement a custom layer and update its weights in PyTorch

You can have a look at this link for how to freeze weights.
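
In case it helps, here is a minimal sketch of what freezing usually looks like: setting `requires_grad = False` on the parameters you don't want updated, so autograd skips them and the optimizer leaves them alone. The `nn.Linear` model here is just a placeholder for your own module.

```python
import torch.nn as nn

model = nn.Linear(10, 2)  # placeholder module; any nn.Module works the same way

# Freeze every parameter: no gradients are computed for them,
# so optimizer.step() will not change these weights.
for param in model.parameters():
    param.requires_grad = False
```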
No, not writing a backward method does not necessarily mean that the weights will not change. If the forward method of your layer uses only differentiable operations, autograd takes care of the backward pass by default, and the optimizer can update the layer's parameters as usual.
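
As a rough illustration (the layer name and shapes below are made up, not from your code): a custom module that defines only `forward` with differentiable ops still gets gradients and weight updates, with no custom backward needed.

```python
import torch
import torch.nn as nn

class ScaledLinear(nn.Module):
    """Hypothetical custom layer: only forward is defined, no backward method."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))

    def forward(self, x):
        # Only differentiable ops, so autograd builds the backward graph itself.
        return 2.0 * (x @ self.weight.t())

layer = ScaledLinear(4, 3)
opt = torch.optim.SGD(layer.parameters(), lr=0.1)

x = torch.randn(8, 4)
loss = layer(x).pow(2).mean()
loss.backward()   # gradients are computed by autograd
opt.step()        # the custom layer's weight is updated
```

You only need to write a custom `torch.autograd.Function` with an explicit `backward` when the forward pass uses non-differentiable operations or you want a custom gradient.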