The generator's output is fed into a custom loss function (no learnable parameters). The custom loss uses an exponential function combined with some other mathematical operations, so during the backward pass the gradients flow through the derivatives of the exponential and of those operations. When the gradient of the custom loss reaches the generator's output, it is on the order of 1e+04, while the gradients of the other losses (there are multiple losses) are on the order of 1e-04 to 1e-05 (I checked the individual loss gradients at the output with a hook).
Do I need to rescale the gradient of the custom loss at the output (e.g. via a hook) down to the 1e-04/1e-05 range so that it does not dominate the gradients of the other losses?
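For reference, a minimal sketch of the setup described above, assuming PyTorch. The tensor `gen_out` stands in for the generator's output, `custom_loss` is a hypothetical exp-based loss, and the hook records the gradient arriving at the output. The second backward pass shows one common alternative to editing gradients in a hook: multiplying the loss by a weight, which scales its gradient by the same factor (the weight `lam` here is purely illustrative).

```python
import torch

# Stand-in for the generator's output (the tensor the hook is attached to).
gen_out = torch.randn(4, 8, requires_grad=True)

def custom_loss(x):
    # An exp-based loss: d/dx exp(x) = exp(x), so gradients can blow up
    # for large activations.
    return torch.exp(x).mean()

# Record the gradient that reaches gen_out. Returning None keeps the
# gradient unchanged; returning a tensor from a hook would replace it.
grads = {}
def save_grad(g):
    grads['custom'] = g.detach().clone()
gen_out.register_hook(save_grad)

custom_loss(gen_out).backward()

# Alternative to rescaling inside the hook: weight the loss term itself.
# lam is a hypothetical value; in practice it would be chosen from the
# observed ratio of gradient magnitudes between the loss terms.
lam = 1e-8
gen_out.grad = None
(lam * custom_loss(gen_out)).backward()
# gen_out.grad is now lam times the original custom-loss gradient.
```

Weighting the loss and rescaling the gradient in a hook are equivalent for this term in isolation; the loss-weight version keeps the scaling visible in the total-loss expression instead of hidden in a hook.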