How to get parameters’ gradients with "torch.nn.utils.prune"?

May I ask how to get parameters’ gradients after applying pruning masks?

I applied prune.global_unstructured with pruning_method=prune.RandomUnstructured to a ResNet50.
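For context, here is a minimal sketch of how I set things up. I'm using torchvision's resnet50 purely for illustration, and the 90% amount and the choice to skip the stem conv1 mirror the log below; my actual script selects the layers slightly differently:

```python
import torch
import torch.nn.utils.prune as prune
from torchvision.models import resnet50

model = resnet50()

# Collect (module, parameter_name) pairs for every Conv2d except the stem conv1,
# which I intentionally leave unpruned.
parameters_to_prune = [
    (module, "weight")
    for name, module in model.named_modules()
    if isinstance(module, torch.nn.Conv2d) and name != "conv1"
]

# Randomly prune 90% of the selected weights, chosen globally across all layers.
prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.RandomUnstructured,
    amount=0.9,
)
```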

I got the following observations. It seems the pruned nn.Parameters are converted into plain torch.Tensors, and I can no longer access their gradients.

* remain weight ratio =  10.01393285846195 %
-> optimizer.step() # after loss.backward(), before gradient descent step
(Pdb) model.conv1.weight.grad.shape # I intentionally avoid pruning conv1
torch.Size([64, 3, 3, 3])
(Pdb) model.layer1[0].conv1.weight.grad.shape # I prune all other weights
main_train.py:1: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the gradient for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations.
  '''
*** AttributeError: 'NoneType' object has no attribute 'shape'
(Pdb) model.layer1[0].conv1.weight.shape
torch.Size([64, 64, 3, 3])
(Pdb) type(model.layer1[0].conv1.weight)
<class 'torch.Tensor'>
(Pdb) type(model.conv1.weight)
<class 'torch.nn.parameter.Parameter'>
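From the pruning docs I understand that pruning reparametrizes each module: the original tensor is kept as weight_orig (still an nn.Parameter, and a leaf), a weight_mask buffer is added, and weight itself is recomputed as weight_orig * weight_mask in a forward pre-hook, which is presumably why it shows up above as a non-leaf torch.Tensor with no .grad. Below is a minimal sketch of what I think the right way to read the gradients is (a single Conv2d instead of the full ResNet50, with a 90% amount just for illustration):

```python
import torch
import torch.nn.utils.prune as prune

conv = torch.nn.Conv2d(64, 64, kernel_size=3, padding=1, bias=False)
prune.random_unstructured(conv, name="weight", amount=0.9)

loss = conv(torch.randn(1, 64, 8, 8)).sum()
loss.backward()

# `weight` is now a non-leaf tensor recomputed from the reparametrization,
# while `weight_orig` is the leaf nn.Parameter that the optimizer actually updates.
print(type(conv.weight))            # <class 'torch.Tensor'>
print(type(conv.weight_orig))       # <class 'torch.nn.parameter.Parameter'>

print(conv.weight.grad)             # None (plus the non-leaf warning shown above)
print(conv.weight_orig.grad.shape)  # torch.Size([64, 64, 3, 3])

# Since weight = weight_orig * weight_mask, the chain rule already zeroes
# conv.weight_orig.grad at the masked positions.
```

Is reading module.weight_orig.grad the intended way to get the gradients of pruned parameters, or is there a better API for this?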