How to normalize gradients?

I know about torch.nn.utils.clip_grad_norm_ for clipping gradients. Is there also a function for simply normalizing them to a given norm value?
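In case it's useful, here is a minimal sketch of how this could be done by hand, since clip_grad_norm_ only scales gradients *down* when they exceed the threshold. The helper name normalize_grad_norm_ is made up for illustration; it rescales all gradients so their combined L2 norm equals a target value:

```python
import torch

def normalize_grad_norm_(parameters, target_norm, eps=1e-12):
    # Hypothetical helper: rescale every gradient in-place so the
    # combined L2 norm over all parameters equals target_norm.
    grads = [p.grad for p in parameters if p.grad is not None]
    total_norm = torch.norm(torch.stack([g.norm(2) for g in grads]), 2)
    scale = target_norm / (total_norm + eps)  # eps avoids division by zero
    for g in grads:
        g.mul_(scale)
    return total_norm  # norm before rescaling, like clip_grad_norm_ returns

# Usage: backprop, then normalize before the optimizer step.
w = torch.nn.Parameter(torch.randn(3, 3))
loss = (w ** 2).sum()
loss.backward()
normalize_grad_norm_([w], target_norm=1.0)
```

Not sure whether something like this already exists in torch.nn.utils, though.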