Loading traced_torch_model.pt in C++ exits with code -1

Hello all, I followed the tutorial and would like to run traced_torch_model.pt in C++. However, the app exits with code -1 immediately. Any advice is appreciated.

#include <torch/script.h>

#include <iostream>
#include <memory>

int main(int argc, const char* argv[]) {
    if (argc != 2) {
        std::cerr << "usage: example-app C:\\Users\\hchang\\Documents\\Gladies\\Python at work\\0813_2021\\traced_torch_model.pt\n";
        return -1;
    }

    torch::jit::script::Module module;
    try {
        // Deserialize the ScriptModule from a file using torch::jit::load().
        module = torch::jit::load(argv[1]);
    }
    catch (const c10::Error& e) {
        std::cerr << "error loading the model\n";
        return -1;
    }
    std::cout << "Model " << argv[1] << " loaded fine\n";

    // Create a vector of inputs.
    std::vector<torch::jit::IValue> inputs;
    inputs.push_back(torch::randn({ 1, 1, 64, 101 }));

    // Execute the model and turn its output into a tensor.
    at::Tensor output = module.forward(inputs).toTensor();
    std::cout << output << "\n";
    int y_hat = output.argmax(1).item().toInt();
    std::cout << "Predicted class: " << y_hat << "\n";
}

I assume you are not seeing any error messages?
If so, try removing the model-loading part of your code and check whether a simple example that creates a dummy model and random inputs works. If it does, check that the path you pass in is valid so the scripted model can actually be loaded.
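
In case it helps, here is a minimal, untested sketch of that check. The torch::nn::Linear layer, its sizes, and the std::filesystem check are only illustrative assumptions (not your model): it skips torch::jit::load entirely, pushes a random input through a dummy C++ module, and then reports whether the file you would pass as argv[1] actually exists. It needs torch/torch.h (not just torch/script.h) and C++17 for <filesystem>.

#include <torch/torch.h>

#include <filesystem>
#include <iostream>

int main(int argc, const char* argv[]) {
    // 1) Dummy model + random input: if this alone fails, the problem is the
    //    LibTorch setup (missing DLLs, build mismatch, ...), not the scripted model.
    torch::nn::Linear dummy(64 * 101, 10);           // stand-in for the real net
    torch::Tensor x = torch::randn({1, 64 * 101});   // random input
    torch::Tensor y = dummy->forward(x);
    std::cout << "Dummy forward ran, output sizes: " << y.sizes() << "\n";

    // 2) Path check: the argument must point to an existing .pt file.
    if (argc == 2) {
        std::cout << "argv[1] exists on disk: " << std::boolalpha
                  << std::filesystem::exists(argv[1]) << "\n";
    } else {
        std::cerr << "no model path was passed as argv[1]\n";
    }
    return 0;
}

If the dummy forward runs fine but the path check prints false (or you never pass an argument at all), then the exit code -1 is simply one of your own return -1 statements firing: either the argc/usage check or the catch block around torch::jit::load.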