How to freeze part of the weight matrix during training?

I have:

model = nn.Sequential(nn.Linear(8, 5), nn.Linear(5, 3), nn.Linear(3, 1))

Now, for model[1].weight, I want to freeze part of this weight matrix during training, i.e., the values of

model[1].weight[0:3,0:3]

shouldn’t vary during backpropagation. Apparently, model[1].weight[0:3,0:3].detach() isn’t working, as the weights still vary during training, and model[1].weight[0:3,0:3].requires_grad = False throws an error.

So, how can I achieve this? Thank you in advance.

I am not sure whether a predefined way exists in PyTorch.
You may need to write a custom linear layer to achieve it, something like the sketch below.
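This is a minimal sketch, not a built-in API: the class name PartiallyFrozenLinear, the hard-coded 3×3 frozen block, and the buffer names are my own choices. The idea is to keep the full weight as a trainable parameter, multiply it by a mask so no gradient reaches the frozen block, and take the frozen block's values from a buffer instead:

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class PartiallyFrozenLinear(nn.Module):
    """Linear layer whose weight[0:3, 0:3] block stays at its initial values.

    Hypothetical sketch: the frozen block is hard-coded to match the question.
    """

    def __init__(self, in_features, out_features, bias=True):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features)) if bias else None
        # Same default init as nn.Linear uses for its weight.
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))

        # mask == 1 where the weight is trainable, 0 where it is frozen.
        mask = torch.ones(out_features, in_features)
        mask[0:3, 0:3] = 0.0
        self.register_buffer("mask", mask)
        # Snapshot of the initial weights; supplies the frozen block's values.
        self.register_buffer("frozen", self.weight.detach().clone())

    def forward(self, x):
        # Trainable entries come from self.weight, frozen entries from the
        # buffer, so autograd only produces nonzero gradients where mask == 1.
        w = self.weight * self.mask + self.frozen * (1.0 - self.mask)
        return F.linear(x, w, self.bias)
```

You could then build the model from the question as:

```python
model = nn.Sequential(nn.Linear(8, 5), PartiallyFrozenLinear(5, 3), nn.Linear(3, 1))
```

One caveat: optimizer mechanisms that move parameters without a gradient (e.g. weight decay) can still drift the frozen entries of self.weight, but the output is unaffected because the forward pass always reads that block from the frozen buffer.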