I have two networks with the same structure:
class MLP : public torch::nn::Module
{
private:
torch::nn::Linear input{nullptr};
torch::nn::Linear hidden{nullptr};
torch::nn::Linear output{nullptr};
public:
MLP();
~MLP();
MLP(const int inDim, const int outDim);
torch::Tensor forward(torch::Tensor x);
};
std::shared_ptr<MLP> NetA;
std::shared_ptr<MLP> NetB;
I only train NetA and update its parameters. At some point, I want to copy the parameters of NetA into NetB.
Is there a function or method in LibTorch that lets me do this directly, something like an assignment operator?
Thanks.