Gradients explode during backward pass

My data (ranging between -1 and 1) is normalized to the range 0 to 1 and then passed through a custom function that uses the exponential function along with some other mathematical operations. During the backward pass, the gradients explode, which changes the weight values so much that the inputs to the custom function go out of bounds (less than 0 or greater than 1) in the next iteration. The backward pass itself runs normally, but I cannot figure out the cause of the issue.

You can reduce the magnitude of the gradients with `torch.nn.utils.clip_grad_norm_`.
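
A minimal sketch of where the clipping call goes in a training loop; the model, optimizer, data, and `max_norm` value here are placeholders, not your actual setup:

```python
import torch

# Placeholder model, optimizer, and loss; substitute your own.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = torch.nn.MSELoss()

# Dummy inputs already normalized to [0, 1], as in your description.
x = torch.rand(32, 10)
y = torch.rand(32, 1)

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Rescale the gradients so their total norm is at most max_norm,
    # after backward() but before the optimizer updates the weights.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
```

The key point is the ordering: clipping has to happen after `loss.backward()` has populated the gradients and before `optimizer.step()` applies them. You may need to tune `max_norm` for your loss scale.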