How to add the gradient rather than subtract it when updating?

I am working on a project where the update rule is
parameter = parameter + learning_rate * gradient
rather than the usual rule, which subtracts the gradient.

I tried setting the learning rate to a negative value to implement this, but negative learning rates are not allowed. So my question is: how can I add the gradient when updating?

Hoping for your reply. Thanks.


I guess the simplest thing is to obtain -gradient. You can do this by flipping the sign of your loss: loss = -loss.
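
A minimal sketch of this idea (the model, data, and optimizer here are just placeholders for illustration): negating the loss makes autograd produce -gradient, so a standard optimizer step effectively adds the original gradient.

import torch
import torch.nn.functional as F

model = torch.nn.Linear(3, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(10, 3)
y = torch.randn(10, 1)

loss = F.mse_loss(model(x), y)
loss = -loss                  # flip the sign of the loss
optimizer.zero_grad()
loss.backward()               # gradients are now -d(loss)/d(parameter)
optimizer.step()              # parameter = parameter - lr * (-gradient)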

Besides flipping the loss, you can write the update rule yourself with a for loop over the network parameters and apply the update with whatever sign you want. This gives you more flexibility.

learning_rate = 0.01
for f in net.parameters():
    # multiplying by -1.0 turns the usual subtraction into an addition
    f.data.sub_(f.grad.data * learning_rate * -1.0)

(Source: code snippet adapted from the PyTorch 60 Minute Blitz tutorial)
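
For what it's worth, on more recent PyTorch versions the same manual update is usually written under torch.no_grad() instead of going through .data. A sketch, assuming net is your model and loss.backward() has already been called:

learning_rate = 0.01
with torch.no_grad():
    for f in net.parameters():
        f.add_(f.grad * learning_rate)   # add the gradient instead of subtracting it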

Thanks so much for the reply.

Thank you for the advice.