Is it possible to apply gradient clipping to a specific layer?

I have been trying to implement DRQN. On the last page of the paper, in Appendix C: Experimental Details, it says "LSTM's gradients were clipped to a value of ten to ensure learning stability." Is this possible? The only way I know of is to pass clipvalue as a parameter to the optimizer.

Yes, torch.nn.utils.clip_grad_norm_ will clip the gradients of whatever parameters you pass it, so you can pass only that layer's parameters instead of the whole model's.
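For example, here is a minimal sketch (the DRQN class and its layer sizes are made up for illustration) that clips only the LSTM layer's gradients after backward() and before the optimizer step. Note that clip_grad_norm_ clips by the total gradient norm; if you want to clamp each gradient element to [-10, 10] as the paper's wording suggests, torch.nn.utils.clip_grad_value_ does that instead.

```python
import torch
import torch.nn as nn

class DRQN(nn.Module):  # hypothetical toy model for illustration
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(84, 64)                       # feature layer
        self.lstm = nn.LSTM(64, 64, batch_first=True)     # recurrent layer
        self.head = nn.Linear(64, 4)                      # Q-value head

    def forward(self, x):
        x = torch.relu(self.fc(x))
        x, _ = self.lstm(x)
        return self.head(x)

model = DRQN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

loss = model(torch.randn(8, 10, 84)).mean()  # dummy loss

optimizer.zero_grad()
loss.backward()

# Clip only the LSTM layer's gradients (by norm, max norm of 10)
torch.nn.utils.clip_grad_norm_(model.lstm.parameters(), max_norm=10.0)

# Or clamp each LSTM gradient element to [-10, 10]:
# torch.nn.utils.clip_grad_value_(model.lstm.parameters(), clip_value=10.0)

optimizer.step()
```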
