model = nn.Sequential(nn.Linear(8, 5), nn.Linear(5, 3), nn.Linear(3, 1))
I want to freeze part of a weight matrix in this model during training, i.e. those values shouldn't change during backpropagation. Apparently `model[0].weight[0:3, 0:3].detach()` isn't working, since the weights still vary during training, and `model[0].weight[0:3, 0:3].requires_grad = False` throws an error (`requires_grad` can only be set on a whole leaf tensor, not on a slice of one).
So, how can I achieve this? Thank you in advance.
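One common workaround, since `requires_grad` is all-or-nothing per tensor, is to register a gradient hook on the weight and zero out the entries of the frozen sub-block before the optimizer sees them. A minimal sketch (this assumes plain SGD without momentum or weight decay, which would otherwise still move the frozen entries):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 5), nn.Linear(5, 3), nn.Linear(3, 1))

# Keep a copy of the sub-block we intend to freeze, to verify it later.
frozen = model[0].weight[0:3, 0:3].clone()

def zero_frozen_grad(grad):
    # Zero the gradient of the frozen sub-block; the optimizer then
    # leaves those entries untouched (with vanilla SGD).
    grad = grad.clone()
    grad[0:3, 0:3] = 0.0
    return grad

model[0].weight.register_hook(zero_frozen_grad)

opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(4, 8)
loss = model(x).pow(2).mean()
loss.backward()
opt.step()

# The frozen block is unchanged while the rest of the matrix was updated.
print(torch.equal(model[0].weight[0:3, 0:3], frozen))
```

An alternative with stateful optimizers (momentum, Adam, weight decay) is to copy the saved `frozen` values back into the weight after each `opt.step()` under `torch.no_grad()`, which guarantees the block stays fixed regardless of the update rule.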