Manual backpropagation in a neural network

In a neural network, when performing backpropagation we compute a change for each weight, and we then either add or subtract that change from the existing weight. My question is: how do we know when to add and when to subtract? When doing backpropagation manually, it's not as if we plot each iteration to check whether the error is increasing or decreasing.
Step by step back propagation example

In the post linked above, the author adds or subtracts the change in weights seemingly at random; there is no explanation for the choice.

The vanilla gradient descent formula updates the parameters as:

new_weights = old_weights - learning_rate * gradient
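
The key point is that you always subtract. Whether a particular weight ends up larger or smaller after the update is decided by the sign of its own gradient: subtracting a positive gradient makes the weight smaller, while subtracting a negative gradient makes it larger. What looks like "sometimes adding, sometimes subtracting" in a worked example is the same subtraction every time. Here is a minimal sketch in NumPy (the weights and gradients are made-up values for illustration):

```python
import numpy as np

# Made-up weights and gradients; one gradient positive, one negative
weights = np.array([0.15, 0.40])
gradients = np.array([0.05, -0.03])
learning_rate = 0.5

# One gradient-descent step: always subtract learning_rate * gradient
new_weights = weights - learning_rate * gradients
print(new_weights)  # [0.125 0.415] -> first weight decreased, second increased
```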

The linked tutorial uses the same formula, as far as I can see:

To decrease the error, we then subtract this value from the current weight (optionally multiplied by some learning rate, eta, which we’ll set to 0.5):
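
To make the sign behaviour concrete, here is a made-up numeric example (the values are not from the tutorial):

new_weight = 0.4 - 0.5 * 0.08 = 0.36 (positive gradient, weight decreases)
new_weight = 0.4 - 0.5 * (-0.08) = 0.44 (negative gradient, weight increases)

Both lines apply exactly the same update rule; the apparent "addition" in the second line comes entirely from the gradient's sign.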

This might help:

https://pytorch.org/tutorials/beginner/blitz/neural_networks_tutorial.html#update-the-weights
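
For completeness, a minimal sketch of that manual update in PyTorch (the model, data, and learning rate below are placeholders, not taken from the tutorial):

```python
import torch
import torch.nn as nn

# Tiny placeholder model; any nn.Module behaves the same way
model = nn.Linear(4, 2)
criterion = nn.MSELoss()

x = torch.randn(1, 4)       # made-up input
target = torch.randn(1, 2)  # made-up target

loss = criterion(model(x), target)
model.zero_grad()
loss.backward()             # populates p.grad for every parameter

learning_rate = 0.01
with torch.no_grad():
    for p in model.parameters():
        p -= learning_rate * p.grad  # always subtract; never a separate "add" case
```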