Pytorch set parameters (weight and bias) of layers

I have constructed a network using the C++ API. I want to assign specific weights and biases, taken from a pretrained TensorFlow model, to my network layers. How can I achieve this? One of my layer declarations is as follows:

    torch::nn::Sequential conv1_1{
        torch::nn::Conv2d(torch::nn::Conv2dOptions(3, 64, 3))};
For now I am assigning parameters randomly as follows:

    torch::NoGradGuard no_grad;
    for (auto& p : this->parameters()) {
        p.uniform_(-1.0, 1.0); // placeholder random initialization
    }
How can I assign a tensor as weight to a layer?

I know that model->conv1_1->parameters()[0] holds the weights and model->conv1_1->parameters()[1] holds the bias of conv1_1. But I can't set the weights with:

    torch::Tensor conv_wght = torch::zeros(
        {64, 3, 3, 3},
        torch::TensorOptions().dtype(torch::kFloat32).device(torch::kCUDA, 0));
    model->conv1_1->parameters()[0] = conv_wght;

Can anyone give me a clue?

    torch::NoGradGuard no_grad;
    for (const auto& param : this->conv1_1->parameters()) {
        param.copy_(in); // in is a torch::Tensor with a matching shape
    }

copy_() solves my problem. Just be careful about the size of in, since this loop will also try to update the bias. Note that plain assignment with = (as attempted above) doesn't work because parameters() returns a fresh vector of tensor handles on each call, so the assignment only rebinds the local handle; copy_() writes the data in place through the handle, so the module's parameter actually changes.