Unable To Save Libtorch Optimizer That Isn't Declared Locally

I have an Adam optimizer defined in my class “MyPolicy” that I use to do all of the training on my policy.

In MyPolicy.h I declare a shared pointer to the optimizer:

std::shared_ptr<torch::optim::Adam> m_AdamOptimizer;

and in the MyPolicy constructor, in MyPolicy.cpp, I initialize it:

m_learning_rate = 0.0003;
m_agent = std::make_shared<Agent>();
m_AdamOptimizer = std::make_shared<torch::optim::Adam>(
    m_agent->parameters(), torch::optim::AdamOptions(m_learning_rate).eps(1e-5)
);

This works perfectly, and I’ve been able to train my policy as expected.
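
(For context, here is a rough sketch of how the shared-pointer optimizer gets used in a training step; computeLoss() is just a hypothetical placeholder for however the loss is produced:)

// sketch of one training step; computeLoss() is a hypothetical placeholder
torch::Tensor loss = computeLoss();
m_AdamOptimizer->zero_grad();
loss.backward();
m_AdamOptimizer->step();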

However, I can’t save the Adam optimizer. When I add the code to save it:

std::string optimizerFileName = "myOptimizer.pt";
torch::save(m_policy.m_AdamOptimizer, optimizerFileName);

I get this error:
error C2679: binary '<<': no operator found which takes a right-hand operand of type 'const Value' (or there is no acceptable conversion)

I understand that the “proper” way to initialize the optimizer would be without a shared pointer to it, such as with this:

torch::optim::Adam optimizer(
    m_agent->parameters(), torch::optim::AdamOptions(m_learning_rate).eps(1e-5)
);

But that “optimizer” variable is only declared locally, so I can’t reference it outside the scope where it’s declared.
I also can’t create a new local optimizer right before saving and set it equal to m_policy.m_AdamOptimizer, because that gives an “attempting to reference a deleted function” error, and I can’t declare the optimizer as a regular (non-shared-pointer) member in the .h file because torch::optim::Adam has no default constructor. Is there a way to declare an optimizer in the .h file and still be able to save it?

Thank you!

I figured this out: you have to dereference the optimizer before it is saved or loaded. This is extremely confusing, since you don’t need to (and in fact shouldn’t) dereference an nn module when saving or loading. As far as I can tell, the serialization operators accept a shared_ptr for nn modules but only take an optimizer by reference, so passing the shared_ptr itself is what triggers the operator<< error above.

torch::save(m_policy.m_agent, agentFileName);
torch::save(*m_policy.m_AdamOptimizer, optimizerFileName);
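
The matching load calls follow the same pattern (a sketch reusing the same file-name variables): the module shared pointer is passed as-is, and the optimizer is dereferenced again.

torch::load(m_policy.m_agent, agentFileName);
torch::load(*m_policy.m_AdamOptimizer, optimizerFileName);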

Hopefully this helps somebody.
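
As an aside, if you would rather avoid the shared pointer entirely, it looks like you can declare the optimizer as a plain member and construct it in the constructor’s member initializer list, since members are initialized in declaration order. A rough sketch (assuming Agent derives from torch::nn::Module and exposes parameters(), as in the original setup):

// MyPolicy.h (sketch)
#pragma once
#include <torch/torch.h>
#include <memory>
#include "Agent.h"  // assumed: the user-defined nn module

class MyPolicy {
public:
    MyPolicy();

    double m_learning_rate = 0.0003;
    std::shared_ptr<Agent> m_agent;      // declared before the optimizer on purpose
    torch::optim::Adam m_AdamOptimizer;  // plain member, no default constructor needed
};

// MyPolicy.cpp (sketch)
MyPolicy::MyPolicy()
    : m_agent(std::make_shared<Agent>()),
      m_AdamOptimizer(m_agent->parameters(),
                      torch::optim::AdamOptions(m_learning_rate).eps(1e-5)) {}

A member declared this way can be passed to torch::save and torch::load directly, with no dereferencing.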