_ivalue_ INTERNAL ASSERT FAILED at "../torch/csrc/jit/api/object.h":37, please report a bug to PyTorch


I am trying to load a trained PyTorch model (a TorchScript ".pt" file exported from Python) into a C++ application. I have had no problem doing this in the past with a similar code block:

    try {
        m_model.eval();
        torch::Tensor output = m_model.forward({xTensor}).toTensor();
        torch::Tensor rndOutput = output.round().to(torch::kInt);
        predCls = rndOutput.item<int>();
    } catch (const std::exception &e) {
        std::string errMsg = e.what();
        std::cout << errMsg << std::endl;
    }

I get this error at the m_model.eval() line…
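For context, `m_model` is a `torch::jit::Module` member that gets loaded earlier. The loading code looks roughly like this (the path handling and surrounding class are abbreviated here):

```cpp
#include <torch/script.h>
#include <iostream>
#include <string>

// In the real application m_model is a class member; shown at file
// scope here only to keep the sketch self-contained.
torch::jit::Module m_model;

void loadModel(const std::string &modelPath) {
    try {
        // torch::jit::load deserializes a TorchScript archive produced in
        // Python via torch.jit.script(...)/torch.jit.trace(...) + .save(...)
        m_model = torch::jit::load(modelPath);
    } catch (const c10::Error &e) {
        std::cout << "Failed to load model: " << e.what() << std::endl;
    }
}
```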

The entire e.what() message is:

    _ivalue_ INTERNAL ASSERT FAILED at "../torch/csrc/jit/api/object.h":37, please report a bug to PyTorch.
    Exception raised from _ivalue at ../torch/csrc/jit/api/object.h:37 (most recent call first):
    frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >) + 0x6b (0x7ffff7f7005b in /home/brendan/libtorch/lib/libc10.so)
    frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, char const*) + 0xc9 (0x7ffff7f6b0e9 in /home/brendan/libtorch/lib/libc10.so)
    frame #2: + 0x45c23fb (0x7fffe4b9e3fb in /home/brendan/libtorch/lib/libtorch_cpu.so)
    frame #3: torch::jit::Module::train(bool) + 0x6da (0x7fffe4b98d3a in /home/brendan/libtorch/lib/libtorch_cpu.so)
    frame #4: + 0xac559 (0x555555600559 in /home/brendan/Repos/prodml/build/main)
    frame #5: + 0xa9dd0 (0x5555555fddd0 in /home/brendan/Repos/prodml/build/main)
    frame #6: + 0xddd71 (0x555555631d71 in /home/brendan/Repos/prodml/build/main)
    frame #7: + 0xda105 (0x55555562e105 in /home/brendan/Repos/prodml/build/main)
    frame #8: + 0xdc36f (0x55555563036f in /home/brendan/Repos/prodml/build/main)
    frame #9: + 0x9dd73 (0x5555555f1d73 in /home/brendan/Repos/prodml/build/main)
    frame #10: + 0x29d90 (0x7fffdfe29d90 in /lib/x86_64-linux-gnu/libc.so.6)
    frame #11: __libc_start_main + 0x80 (0x7fffdfe29e40 in /lib/x86_64-linux-gnu/libc.so.6)
    frame #12: + 0x13aa5 (0x555555567aa5 in /home/brendan/Repos/prodml/build/main)

Any help would be appreciated, thank you.