Gradient clipping in DNNs

How to prevent exploding weight gradients via gradient clipping

You can apply torch.nn.utils.clip_grad_norm_ (rescales all gradients so their combined norm does not exceed a threshold) or torch.nn.utils.clip_grad_value_ (clamps each gradient element to a range) to the desired parameters. Call either one after loss.backward() and before optimizer.step(), so the clipped gradients are what the optimizer actually uses.
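A minimal sketch of both approaches in a training step; the linear model, loss, and the thresholds (max_norm=1.0, clip_value=0.5) are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

# Toy model and batch; any model's parameters are clipped the same way.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 10)
y = torch.randn(32, 1)

optimizer.zero_grad()
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# Option 1: rescale gradients in place so their total L2 norm is at most 1.0.
# Returns the total norm measured *before* clipping.
total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

# Option 2: clamp each gradient element to [-0.5, 0.5] in place.
torch.nn.utils.clip_grad_value_(model.parameters(), clip_value=0.5)

# The optimizer now steps with the clipped gradients.
optimizer.step()
```

In practice you would pick one of the two: norm clipping preserves the gradient's direction while bounding its magnitude, whereas value clipping can change the direction because each element is clamped independently.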