How can I roll back the impact of a loss function at low cost?

I am new to PyTorch. I want to know how to roll back the impact of a loss with as little cost as possible.
For example, suppose a model H has parameters p at some moment, and a loss L is then calculated from the model's prediction. The update works like:

L = criterion(prediction, target)
L.backward()       # accumulate dL/dp into the parameters' .grad
optimizer.step()   # update the parameters: p -> p'

After the step, the parameters p become p', and I want to get the parameters p back, or at least parameters approximating p. How can I do this?
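The straightforward way I can think of is to snapshot the parameters before the step and restore them afterwards, but that copies every tensor, which seems expensive for a large model. A minimal sketch of what I mean (the model, optimizer, and data here are just placeholders I made up):

import copy
import torch
import torch.nn as nn

model = nn.Linear(4, 1)                                  # stand-in for model H
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

prediction = model(torch.randn(8, 4))
target = torch.randn(8, 1)

snapshot = copy.deepcopy(model.state_dict())             # save p before the update

L = criterion(prediction, target)
L.backward()
optimizer.step()                                         # p -> p'

model.load_state_dict(snapshot)                          # roll back: p' -> p

This restores p exactly, but the deepcopy is what I would like to avoid.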
So instead, how should I analyze an operation like the following, which tries to undo the step with a negated loss?

L = criterion(prediction, target)
L1 = -L                          # negated loss
L.backward(retain_graph=True)    # retain the graph so L1.backward() can run later
optimizer.step()                 # p -> p'
optimizer.zero_grad()            # clear the gradients of L
L1.backward()                    # gradients are -dL/dp, still computed at p
optimizer.step()                 # second update, hopefully p' -> ~p
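
For concreteness, here is a self-contained version of this idea that I tried to reason through (assuming plain SGD without momentum; I expect that with momentum or Adam the second step would not exactly invert the first):

import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 1)                                  # stand-in for model H
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # plain SGD, no momentum
criterion = nn.MSELoss()

p_before = copy.deepcopy(model.state_dict())             # reference copy of p

prediction = model(torch.randn(8, 4))
target = torch.randn(8, 1)

L = criterion(prediction, target)
L1 = -L
L.backward(retain_graph=True)
optimizer.step()                                         # p -> p' = p - lr * dL/dp
optimizer.zero_grad()
L1.backward()                                            # -dL/dp, still taken at p
optimizer.step()                                         # p' - lr * (-dL/dp) = p

for name, param in model.named_parameters():
    print(name, torch.allclose(param, p_before[name]))   # expect True, True

My reasoning is that the gradients of L1 come from the retained graph, so they are still evaluated at p, and for vanilla SGD the second step computes p' + lr * dL/dp, which is exactly p again.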

Is this approach OK? Thank you!