What's the best way to monitor net parameters and gradients?

I find it tedious to monitor the net parameters and gradients when training with PyTorch. I first have to collect the parameters with
p = list(net.parameters())
and then find the exact index of the parameter I want. After that, I use p[i].data.cpu().numpy() to monitor the parameter values and p[i].grad.data.cpu().numpy() to monitor the gradients.
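For context, here is a minimal sketch of the workflow I mean (using a hypothetical one-layer net just for illustration):

```python
import torch
import torch.nn as nn

# A tiny stand-in network; any nn.Module behaves the same way.
net = nn.Linear(4, 2)

# Run a forward/backward pass so gradients exist.
out = net(torch.randn(3, 4))
out.sum().backward()

# The manual workflow: index into the parameter list,
# then pull values and gradients out as NumPy arrays.
p = list(net.parameters())
weight_values = p[0].data.cpu().numpy()  # weight tensor of the layer
weight_grads = p[0].grad.data.cpu().numpy()  # its accumulated gradient
print(weight_values.shape)  # (2, 4)
print(weight_grads.shape)   # (2, 4)
```

Doing this for every parameter of a real model means tracking indices by hand, which is what I'd like to avoid.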
Is there a better way to monitor all the parameters with cleaner code? I'd also like to list all parameters together with their names.
Thank you! Any suggestion is welcome!