How does one make sure that the parameters are updated manually in PyTorch when using modules?

Have a look at my answer in What is the recommended way to re-assign/update values in a variable (or tensor)?

Whenever a function name in PyTorch ends with an underscore, that function operates in-place.
So x.sub_(w) is the same as x -= w for tensors x and w.
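A minimal sketch of what this looks like for a manual parameter update (the `nn.Linear` model, learning rate, and loss here are just illustrative assumptions): the in-place `sub_` modifies the tensor directly, and wrapping the update in `torch.no_grad()` keeps autograd from tracking it.

```python
import torch
import torch.nn as nn

# Trailing underscore means in-place: x.sub_(w) modifies x, like x -= w.
x = torch.tensor([3.0, 4.0])
w = torch.tensor([1.0, 1.0])
x.sub_(w)  # x is now tensor([2.0, 3.0])

# Manual SGD-style step on a module's parameters.
model = nn.Linear(2, 1)
loss = model(torch.randn(5, 2)).pow(2).mean()
loss.backward()

lr = 0.1
with torch.no_grad():  # don't record the update in the autograd graph
    for p in model.parameters():
        p.sub_(lr * p.grad)  # in-place update of the parameter tensor
    model.zero_grad()  # clear gradients before the next backward pass
```

Using an out-of-place operation (e.g. `p = p - lr * p.grad`) would instead rebind the local name to a new tensor and leave the module's parameter unchanged, which is why the in-place variants matter here.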
