import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 5), nn.Linear(5, 3), nn.Linear(3, 1))
Now, for model[1].weight, I want to freeze part of the weight matrix during training, i.e. the entries
model[1].weight[0:3, 0:3]
shouldn't change during backpropagation. Apparently model[1].weight[0:3, 0:3].detach() doesn't work, as those weights still vary during training, and setting model[1].weight[0:3, 0:3].requires_grad = False throws an error.
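A likely reason both attempts fail: indexing a parameter creates a new (non-leaf) view tensor, so detach() returns a separate tensor that doesn't affect the parameter, and requires_grad can only be set on leaf tensors. One common workaround (a sketch, not from the original question) is to leave the parameter trainable but zero out the gradient of the sub-block with a gradient hook, so the optimizer never updates those entries:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 5), nn.Linear(5, 3), nn.Linear(3, 1))

# Keep a copy of the block we want frozen, to verify later.
frozen_block = model[1].weight[0:3, 0:3].clone()

def zero_subblock_grad(grad):
    # Return a modified gradient with the frozen block zeroed out.
    grad = grad.clone()
    grad[0:3, 0:3] = 0.0
    return grad

model[1].weight.register_hook(zero_subblock_grad)

# Plain SGD: a zero gradient means a zero update for those entries.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(4, 8)
loss = model(x).sum()
loss.backward()
optimizer.step()

assert torch.equal(model[1].weight[0:3, 0:3], frozen_block)
```

Caveat: this only guarantees the block stays fixed when a zero gradient implies a zero update. With weight decay, or stateful optimizers like Adam whose momentum terms can produce nonzero updates from past gradients, you would additionally need to restore the block after each step (e.g. copy the saved values back under torch.no_grad()).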