Freezing just a few weights (instead of the whole layer)

Heya, I was trying to replicate the pruning experiments from the paper The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks, whose reference implementation is written in TensorFlow. I have managed to create the mask and apply it, but I can't figure out how to freeze the pruned weights. As far as I can tell, it is only possible to freeze an entire layer or parameter at once, not individual weights.

I tried looking at Rethinking network pruning (code), since it has an implementation, but couldn't find how they froze the pruned weights. In another implementation, Lottery ticket in Pytorch, they explicitly zero the gradients of pruned weights after each backward pass, which seems cumbersome and wastes computation in autograd, since the full gradients are still being computed. Is there a workaround? (A rough sketch of that gradient-zeroing approach is below.)
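For reference, here is a minimal sketch of the gradient-zeroing approach I mean; the layer and mask are placeholder names, not taken from either linked repo:

```python
import torch
import torch.nn as nn

# Placeholder setup: a layer and a 0/1 mask with the same shape as its weight.
layer = nn.Linear(10, 10)
mask = (torch.rand_like(layer.weight) > 0.5).float()  # stand-in pruning mask

loss = layer(torch.randn(4, 10)).sum()
loss.backward()

# Zero the gradients of pruned weights after backward(); autograd has already
# computed the full gradient, which is the wasted work mentioned above.
with torch.no_grad():
    layer.weight.grad.mul_(mask)
```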

Also, if there is no other way, could backward hooks make the process a little easier? I am trying to use this with the fastai package. Any help is really appreciated.
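By the hook approach I mean something like the following (again just a sketch with placeholder names), where the gradient is masked as soon as autograd produces it:

```python
import torch
import torch.nn as nn

layer = nn.Linear(10, 10)
mask = (torch.rand_like(layer.weight) > 0.5).float()  # stand-in pruning mask

# Mask the weight's gradient as soon as it is computed, so the training
# loop itself needs no extra bookkeeping.
layer.weight.register_hook(lambda grad: grad * mask)

loss = layer(torch.randn(4, 10)).sum()
loss.backward()  # layer.weight.grad is already masked at this point
```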


Check out the pruning functionality added in PyTorch v1.4 under torch.nn.utils.prune. It was built with lottery ticket hypothesis research in mind. The masking is applied in a forward pre-hook, so it is part of the computation graph and is automatically accounted for in the backward pass too, effectively freezing the pruned parameters.
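For example, an existing mask can be applied with prune.custom_from_mask. A minimal sketch (the random mask below is just a stand-in for one computed by magnitude pruning):

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(10, 10)
mask = (torch.rand_like(layer.weight) > 0.5).float()  # stand-in pruning mask

prune.custom_from_mask(layer, name="weight", mask=mask)

# The original tensor is kept as `weight_orig`, the mask is stored as the
# buffer `weight_mask`, and `weight` is recomputed as their product in a
# forward pre-hook, so pruned entries stay zero throughout training.
print([name for name, _ in layer.named_buffers()])  # ['weight_mask']
```

If you later want to make the pruning permanent and drop the reparametrization, prune.remove(layer, "weight") folds the mask back into the weight tensor.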

You might also want to check out the PyTorch pruning tutorial: https://pytorch.org/tutorials/intermediate/pruning_tutorial.html
