I know the function torch.nn.utils.clip_grad_norm_ for clipping the gradient norm. Is there also one for simply rescaling (normalizing) the gradient to a given norm value, rather than only clipping it when it exceeds that value?
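In case it helps frame the question: a minimal sketch of what I mean by "normalizing", done manually over a model's parameters (the model, target norm, and epsilon here are just illustrative assumptions, not an existing utility):

```python
import torch

# Illustrative model and backward pass
model = torch.nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).sum()
loss.backward()

target_norm = 1.0  # the norm value to rescale the gradient to (assumed)

# Global L2 norm over all parameter gradients
total_norm = torch.norm(
    torch.stack([p.grad.norm() for p in model.parameters() if p.grad is not None])
)

# Rescale every gradient so the global norm equals target_norm
# (unlike clip_grad_norm_, this also scales *up* small gradients)
for p in model.parameters():
    if p.grad is not None:
        p.grad.mul_(target_norm / (total_norm + 1e-6))
```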