Different weights in loss.backward() and optimizer.step()

I want to use different weights in the backward pass and in the update step, i.e.:

w1.grad = dloss/dw1
w_new = w2 - lr * w1.grad

instead of the usual w_new = w1 - lr * w1.grad.
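
Concretely, here is a minimal sketch of the update I have in mind (w1, w2, lr, and the loss are just placeholders; both tensors have the same shape):

```python
import torch

# Placeholder setup: two same-shaped parameter tensors and a learning rate.
w1 = torch.randn(3, requires_grad=True)
w2 = torch.randn(3, requires_grad=True)
lr = 0.1

x = torch.randn(3)
loss = ((w1 * x).sum() - 1.0) ** 2  # any loss that depends on w1
loss.backward()                      # populates w1.grad = dloss/dw1

# Update starts from w2 but follows the gradient computed at w1.
with torch.no_grad():
    w_new = w2 - lr * w1.grad
```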

How can I do that with a standard PyTorch optimizer?
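
One approach I tried, continuing from the snippet above, is to copy the gradient across and step an optimizer that only holds w2, though I'm not sure this is the intended way:

```python
# Step an optimizer that only holds w2, after copying over
# the gradient that was computed w.r.t. w1.
opt = torch.optim.SGD([w2], lr=lr)
opt.zero_grad()
w2.grad = w1.grad.clone()  # reuse dloss/dw1 as w2's gradient
opt.step()                 # in-place: w2 <- w2 - lr * w1.grad
```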