Trivial model on iOS gets EXC_BAD_ACCESS on forward call

I think I’m missing something basic, so apologies if this is obvious.

Running PyTorch 1.9.1; iOS LibTorch-Lite 1.9.0

I’m trying to run a model on iOS under C++. During the call to forward(), the get_method("forward") call works, and then the thread crashes on the std::move(inputs) call with an EXC_BAD_ACCESS error.

I’ve created a trivial model in python and saved it for mobile:

import torch

class SampleModel(torch.nn.Module):
    def forward(self, images):
        return torch.mean(images, dim=(1, 2, 3))

model = SampleModel()

scripted_model = torch.jit.script(model)
scripted_model._save_for_lite_interpreter(path)

On iOS I’m using trivial run logic:

torch::jit::mobile::Module model = ::torch::jit::_load_for_mobile(path);

c10::InferenceMode guard;
std::vector<torch::jit::IValue> inputs;

inputs.push_back(torch::ones({1, 1, 224, 224}, torch::kFloat32));

at::Tensor result = model.forward(inputs).toTensor();

This is as trivial an example as I’ve been able to build, and I consistently get the EXC_BAD_ACCESS exception. I tried saving a standard TorchScript (non-mobile) model and switching to LibTorch to see whether it’s the mobile runtime, and I get an internal assert thrown instead.

Anyone seen this or have thoughts on what I could be doing wrong?

Thanks in advance.

Resolved - moral of the story is to make sure there aren’t similarly named private variables in your C++ class. The Module was being assigned to the wrong one. Please disregard…
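For anyone hitting the same crash: the failure mode here is that an assignment lands on a local (or similarly named) variable instead of the class member, so forward() later runs against a default-constructed Module. A minimal torch-free sketch of the pattern, with hypothetical class and variable names, illustrating the shadowing only:

```cpp
#include <string>

class Runner {
public:
    void loadBuggy(const std::string& path) {
        // BUG: this declares a NEW local 'model_' that shadows the
        // member; Runner::model_ stays default-constructed.
        std::string model_ = path;
        (void)model_;  // local dies here; member never assigned
    }
    void loadFixed(const std::string& path) {
        model_ = path;  // assigns the actual member
    }
    const std::string& model() const { return model_; }
private:
    std::string model_;  // the member that forward() would later use
};
```

After loadBuggy(), model() is still empty - the C++ analogue of calling forward() on a mobile::Module that was never actually assigned, which is what produced the EXC_BAD_ACCESS here.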

I didn’t see any obvious problem, except that “c10::InferenceMode guard” should be declared before _load_for_mobile. Could you paste the full error message?

Couldn’t figure out how to close the post - in the full code, the model was loaded into a variable that was in a different scope. Dumb error - working now. Thanks!
