Accessing objective function's value from optimizer

Hello, I’m trying to implement custom optimizers and need access to the value of the objective function. As far as I understand, if f(x) is the objective function, then

  • p.data is x
  • p.grad.data is the gradient of f at x

where

for group in self.param_groups:
    for p in group['params']:
        pass

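To make that mapping concrete, here is a small illustrative check using a built-in optimizer (the function and values are made up for the example, not taken from the thread):

```python
import torch

# f(x) = x^2, so grad f(x) = 2x
x = torch.nn.Parameter(torch.tensor([3.0]))
opt = torch.optim.SGD([x], lr=0.1)

loss = (x ** 2).sum()
loss.backward()

for group in opt.param_groups:
    for p in group['params']:
        print(p.data)       # x              -> tensor([3.])
        print(p.grad.data)  # grad of f at x -> tensor([6.])
```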
Is there a good way to get the value f(x) from inside the optimizer? Thank you.

If you need the value of the objective function inside the step() function, you need to implement your optimizer so that it requires the closure argument. Calling the closure returns the loss, and you can then inspect its value. You can take a look at how L-BFGS is implemented.

@apaszke Thank you.
but I’m still not sure what kind of closure is required.
If the loss function is, for example, nn.NLLLoss or F.nll_loss, then what is the closure?
Please show me an example or a more detailed explanation. Thank you.

You can find more about closures in the optim docs.
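In practice, the closure is just a function that re-runs the forward pass, computes the loss (here with F.nll_loss, as asked), calls backward(), and returns the loss. The model and data below are illustrative placeholders:

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(4, 3)          # illustrative model
inputs = torch.randn(8, 4)             # illustrative batch
targets = torch.randint(0, 3, (8,))
optimizer = torch.optim.LBFGS(model.parameters())

def closure():
    optimizer.zero_grad()
    log_probs = F.log_softmax(model(inputs), dim=1)
    loss = F.nll_loss(log_probs, targets)
    loss.backward()
    return loss

# step() calls the closure (possibly several times for L-BFGS)
# and returns the loss, whose value you can inspect.
loss = optimizer.step(closure)
```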

@apaszke Thank you, I had skipped over that section.
My implementation finally works with closure().