Freeze single weight in CNN

Hi, I found that requires_grad = False works only for a whole layer rather than a single weight. Is there an approach to freeze a single/individual weight in the network? BTW, I have noticed a method of zeroing the weight's gradient through a mask (How to freeze a single weight of a conv2d layer - #2 by ptrblck), but is there another, more direct approach?
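For reference, here is a minimal sketch of the mask-based approach mentioned above (the layer shape and the frozen index are just illustrative); it registers a hook that multiplies the gradient by a 0/1 mask, so plain SGD leaves the masked weight untouched:

```python
import torch
import torch.nn as nn

# Illustrative layer; shapes/indices are arbitrary for this sketch.
conv = nn.Conv2d(1, 1, kernel_size=3, bias=False)

# Mask: 0 where the weight should stay frozen, 1 elsewhere.
mask = torch.ones_like(conv.weight)
mask[0, 0, 1, 1] = 0.0  # freeze the center weight of the first filter

frozen_value = conv.weight.data[0, 0, 1, 1].clone()

# Zero out the frozen weight's gradient whenever it is computed.
conv.weight.register_hook(lambda grad: grad * mask)

optimizer = torch.optim.SGD(conv.parameters(), lr=0.1)
x = torch.randn(1, 1, 8, 8)
conv(x).sum().backward()
optimizer.step()

# The masked weight is unchanged after the update step.
print(torch.equal(conv.weight.data[0, 0, 1, 1], frozen_value))
```

Note that zeroing the gradient is only exact for optimizers without state: with momentum or weight decay (e.g. Adam, or SGD with momentum), the frozen weight can still drift, so you may also need to restore its value after each step.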