How to run inference on dictionary data with the libtorch API?

Hi, I am trying to run inference on structured data through the libtorch API; the network is similar to VectorNet, and I want to run inference on dict data on the GPU. The CPU version of the C++ code is below. How should I modify the data-construction part for a GPU version?

// load script model
torch::jit::script::Module module = torch::jit::load(model_path);

// construct fake input data[0]
std::unordered_map<std::string, std::vector<torch::Tensor>> actors_data;
actors_data["types_int"] = {torch::randn({12, 1}), torch::randn({23, 1})};                      // [5, 1]
actors_data["history_speeds"] = {torch::randn({12, 16, 2}), torch::randn({23, 16, 2})};         // [5, 16, 2]
actors_data["history_headings"] = {torch::randn({12, 16, 1}), torch::randn({23, 16, 1})};       // [5, 16, 1]
actors_data["history_trajs"] = {torch::randn({12, 16, 3}), torch::randn({23, 16, 3})};          // [5, 16, 3]
actors_data["history_trajs_relative"] = {torch::randn({12, 16, 3}), torch::randn({23, 16, 3})}; // [5, 16, 3]

// construct fake input data[1]
std::unordered_map<std::string, std::vector<std::vector<torch::Tensor>>> lanes_data;            //[0][32, 6]   [1][32, 6, 60, 2]
lanes_data["centerlines"] = {{torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2})}, 
        {torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2}), torch::randn({10,2})}};

// construct torch::jit::IValue
std::vector<torch::jit::IValue> inputs;
inputs.push_back(actors_data);
inputs.push_back(lanes_data);

// do inference
auto outputs = module.forward(inputs).toTuple();

You could either move the tensors to the GPU via:

tensor = tensor.to({at::kCUDA, 1});

or specify the device directly during tensor creation via:

at::randn({12, 1}, at::kCUDA);

Thanks for your reply. I have solved this problem by moving only the dict's values to the GPU and leaving the keys as they are. Yesterday I was confused about how to move the full dict entity (both keys and values) to the GPU.