Setting up sparse connections without gradients

Hello everyone,

Hope you are having a productive day.

I am trying to build a model which has

  • sparse connections between layers
  • some weights that should not be updated

For example,

import torch.nn as nn
import torch.nn.utils.prune as prune

self.L_vc = nn.Linear(E, E, bias=False)
for p in self.L_vc.parameters():
    p.requires_grad = False
prune.custom_from_mask(self.L_vc, name='weight', mask=vc_mask)

Here, I specify the connectivity between the neurons using the tensor vc_mask and prevent the weights from being updated by setting p.requires_grad = False. I also set bias=False since I just want L_vc to specify the connectivity, like an adjacency matrix. I am not sure if this is the correct way to do that.
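As a sanity check (standalone, with a hypothetical size E and a random 0/1 mask in place of my real vc_mask), I verified that pruning reparametrizes the layer as weight = weight_orig * weight_mask and that the masked entries are exactly zero:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

E = 4  # hypothetical size for this check
L_vc = nn.Linear(E, E, bias=False)
for p in L_vc.parameters():
    p.requires_grad = False

# stand-in for my real connectivity mask
vc_mask = (torch.rand(E, E) > 0.5).float()
prune.custom_from_mask(L_vc, name='weight', mask=vc_mask)

# after pruning, 'weight' is recomputed from 'weight_orig' and 'weight_mask'
print(torch.equal(L_vc.weight, L_vc.weight_orig * L_vc.weight_mask))
print(bool((L_vc.weight[vc_mask == 0] == 0).all()))  # masked entries are zero
```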

Am I blocking gradients from propagating backwards by inserting a constant layer like this? What should I expect if such a layer sits in the middle of the model? Any help would be highly appreciated.


Since you have set requires_grad = False as in the lines above, gradients will still propagate backwards through L_vc to the earlier layers, but L_vc's own weights will not be updated. This will serve your purpose.
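A minimal sketch to convince yourself, assuming a hypothetical size E and a random mask: a frozen, masked L_vc in the middle of a small model still lets gradients reach the layer before it, while its own (frozen) weight receives no gradient.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

E = 4  # hypothetical size

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(E, E)            # trainable layer before L_vc
        self.L_vc = nn.Linear(E, E, bias=False)
        for p in self.L_vc.parameters():
            p.requires_grad = False
        vc_mask = (torch.rand(E, E) > 0.5).float()  # stand-in connectivity mask
        prune.custom_from_mask(self.L_vc, name='weight', mask=vc_mask)
        self.fc2 = nn.Linear(E, 1)            # trainable layer after L_vc

    def forward(self, x):
        return self.fc2(self.L_vc(self.fc1(x)))

net = Net()
net(torch.randn(8, E)).sum().backward()

print(net.fc1.weight.grad is not None)    # gradients flowed through L_vc
print(net.L_vc.weight_orig.grad is None)  # frozen weights got no gradient
```

Note that after pruning, the underlying parameter is named weight_orig; that is the tensor an optimizer would see, and it stays frozen.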