Setting some elements of layer parameters to 0 at each training iteration

After some searching, I found a couple of existing answers that are close to what I want to do:

Can .detach() work for parts of the layer weights?
and
Update only sub-elements of weights

It seems these are not optimal in terms of saving resources, since they use a gradient mask and set the gradient to 0 where needed, rather than skipping the computation for those elements entirely. However, this seems to be the only workable solution for me right now.
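
For reference, here is a minimal sketch of that gradient-mask approach. The layer, the mask, and the training loop below are all made up for illustration; the idea is just to multiply the gradient by a fixed 0/1 mask before the optimizer step, so the masked elements never receive an update:

```python
import torch
import torch.nn as nn

# Hypothetical setup: one linear layer and a fixed 0/1 mask
# (0 = keep this weight element frozen at 0, 1 = train normally).
layer = nn.Linear(4, 4)
mask = (torch.rand_like(layer.weight) > 0.5).float()

# Zero the masked elements once so they actually start at 0.
with torch.no_grad():
    layer.weight.mul_(mask)

# The hook multiplies the incoming gradient by the mask, so the
# optimizer sees a zero gradient for the masked elements.
layer.weight.register_hook(lambda grad: grad * mask)

optimizer = torch.optim.SGD(layer.parameters(), lr=0.1)

for _ in range(100):
    x = torch.randn(8, 4)
    loss = layer(x).pow(2).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Note that this still computes the full gradient, which is exactly the resource concern above: the mask only cancels the update, it doesn't skip the work. Also, depending on the optimizer's internal state (momentum buffers, weight decay), re-applying the mask to the weights themselves after optimizer.step() can serve as a simple safety net to keep the masked elements at exactly 0.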