Hi everyone,
I was wondering what the best way is to scale the gradient before stepping the weights during backpropagation. For example, let's say I have a weight w, a learning rate alpha, and some custom function f(x). Before updating the weight, I want to scale the gradient based on my custom function f(x), as shown below. Thank you for your time and help.
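To make the question concrete, here is a minimal plain-Python sketch of one possible reading of the update I have in mind: the gradient g is multiplied by f(g) before the usual SGD step, i.e. w ← w − alpha · f(g) · g. The function f below is just a placeholder, and the exact rule is an assumption on my part, not a fixed requirement.

```python
def f(x):
    # Placeholder scaling function (an arbitrary example choice);
    # in practice this would be replaced by the custom f(x).
    return 1.0 / (1.0 + abs(x))

def sgd_step_scaled(w, grad, alpha):
    # Scale the raw gradient by f(grad) before the standard SGD update:
    # w_new = w - alpha * f(grad) * grad
    return w - alpha * f(grad) * grad

# Single-scalar example step.
w = 0.5
grad = 2.0
w = sgd_step_scaled(w, grad, alpha=0.1)
```

In a real framework this would presumably go between the backward pass (which computes grad) and the optimizer step, rather than being hand-rolled like this.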