How to copy the parameters of one network to another directly in LibTorch C++?

I have two networks with the same structure,

class MLP : public torch::nn::Module {
public:
    MLP(int inDim, int outDim);
    torch::Tensor forward(torch::Tensor x);

private:
    torch::nn::Linear input{nullptr};
    torch::nn::Linear hidden{nullptr};
    torch::nn::Linear output{nullptr};
};

std::shared_ptr<MLP> NetA;
std::shared_ptr<MLP> NetB;

I only train NetA and update its parameters. At some point, I want to copy NetA's parameters into NetB.

Is there a function or method in LibTorch that lets me do this directly, something like an assignment operator?


Saving and loading works, but it is not elegant:

std::shared_ptr<MLP> NetA;
std::shared_ptr<MLP> NetB;

// ...

torch::save(NetA, "./net.pt");  // save to a file, not a directory
torch::load(NetB, "./net.pt");

REF: (libtorch) How to save model in MNIST cpp example?