I am pretty new to PyTorch and keep being surprised by its performance.
I have followed the tutorials, but one thing still isn't clear to me: how are optimizer.step() and loss.backward() related?
When I inspect the loss computed by the loss function, it is just a Tensor and doesn't seem to be connected to the optimizer in any way.
Here are my questions:
(1) Does optimizer.step() optimize based on the most recent loss.backward() call?
(2) What happens if I call backward() on several different losses and then call optimizer.step()? Does the optimizer optimize based on all of the previously computed losses?