Libtorch, how to add a new optimizer

As a test, I copied the files “adam.h” and “adam.cpp”, changed every related occurrence of “Adam” to “MyAdam”, and included the copied header in “optim.h”. After compiling, when I use “MyAdam” in new code, the link step fails with undefined symbols:

    Undefined symbols for architecture x86_64:
      "torch::optim::MyAdamOptions::MyAdamOptions(double)", referenced from:
          _main in Net_test.cc.o
      "torch::optim::MyAdam::step(std::__1::function<at::Tensor ()>)", referenced from:
          _main in Net_test.cc.o
      "vtable for torch::optim::MyAdamOptions", referenced from:
          torch::optim::MyAdamOptions::MyAdamOptions(torch::optim::MyAdamOptions const&) in Net_test.cc.o
      NOTE: a missing vtable usually means the first non-inline virtual member function has no definition.
      "vtable for torch::optim::MyAdam", referenced from:
          torch::optim::MyAdam::MyAdam(std::__1::vector<torch::optim::OptimizerParamGroup, std::__1::allocator<torch::optim::OptimizerParamGroup> >, torch::optim::MyAdamOptions) in Net_test.cc.o
      NOTE: a missing vtable usually means the first non-inline virtual member function has no definition.
    ld: symbol(s) not found for architecture x86_64

Is there something I missed?

For example:

    // param: the parameters to optimize (e.g. a std::vector<torch::Tensor>)
    auto opt = torch::optim::MyAdam(param);
    // defaults() returns OptimizerOptions&; cast it to the concrete options type
    auto& options = static_cast<torch::optim::MyAdamOptions&>(opt.defaults());
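For context, the undefined symbols map one-to-one onto declarations in the renamed header. Here is a minimal sketch of what that header presumably declares, reconstructed from the linker output above rather than from the actual file:

    #pragma once
    #include <torch/optim/optimizer.h>
    #include <vector>

    namespace torch {
    namespace optim {

    // Declarations only -- the matching definitions live in the renamed .cpp.
    struct TORCH_API MyAdamOptions
        : public OptimizerCloneableOptions<MyAdamOptions> {
      MyAdamOptions(double lr = 1e-3);  // -> "MyAdamOptions::MyAdamOptions(double)"
      TORCH_ARG(double, lr) = 1e-3;
      // ... remaining TORCH_ARG fields (betas, eps, weight_decay, amsgrad) ...
    };

    class TORCH_API MyAdam : public Optimizer {
     public:
      explicit MyAdam(std::vector<OptimizerParamGroup> param_groups,
                      MyAdamOptions defaults = {});
      // Virtual; if it has no definition in any compiled translation unit,
      // the compiler never emits the vtable.
      torch::Tensor step(LossClosure closure = nullptr) override;
    };

    } // namespace optim
    } // namespace torch

Every one of these is a declaration only. If the renamed .cpp is never compiled into the build, the linker has nothing to resolve, and because the virtual step() then has no definition in any translation unit, no vtable is emitted either, which is exactly what the two NOTE lines report.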

@freezek, the implementation of certain libtorch classes is not strictly contained in a single .cpp file, so you might need to copy all of the related implementation files to complete MyAdam. If you want to do this kind of copying, you have to search for the files related to Adam at the source level (i.e., the code downloaded directly from the PyTorch GitHub codebase) instead of searching for the Adam implementation in the distributed libtorch package, because the distributed version has already compiled the Adam implementation into the library, and you cannot access those sources.
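For reference, in the upstream source tree of that era the C++ API's Adam code was split across at least the following files (paths from a roughly 1.5-era checkout; verify against the exact version being built):

    torch/csrc/api/include/torch/optim/adam.h  // declarations (AdamOptions, Adam)
    torch/csrc/api/src/optim/adam.cpp          // definitions: constructors, step(), serialization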


Hi @Lin_Jia, thanks for the reply. I found that adding the MyAdam.cpp path to caffe2/CMakeLists.txt works.
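To spell that out: the renamed .cpp has to become part of libtorch's own build so that its definitions are compiled into the library. A rough sketch of the kind of edit involved; the variable names here (TORCH_SRCS, TORCH_SRC_DIR) and the exact spot in caffe2/CMakeLists.txt vary between PyTorch versions, so locate the existing adam.cpp entry and add the new file next to it:

    # caffe2/CMakeLists.txt (illustrative excerpt, not a verbatim patch)
    list(APPEND TORCH_SRCS
      ${TORCH_SRC_DIR}/csrc/api/src/optim/myadam.cpp  # hypothetical renamed copy
    )

After rebuilding libtorch, the MyAdam symbols are defined inside the library and the undefined-symbol errors go away.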
