Named arguments and dict output with AOTInductor

  std::vector<at::Tensor> AOTIModelPackageLoader::run(
      const std::vector<at::Tensor>& inputs,
      void* stream_handle = nullptr);
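
For context, the positional flow this signature implies looks roughly like the sketch below (assumptions: the model was packaged on the Python side, e.g. with torch._inductor.aoti_compile_and_package, and "model.pt2", the device, and the input shape are placeholders):

  #include <iostream>
  #include <vector>

  #include <torch/csrc/inductor/aoti_package/model_package_loader.h>
  #include <torch/torch.h>

  int main() {
    c10::InferenceMode mode;

    // Load a model packaged ahead of time on the Python side.
    torch::inductor::AOTIModelPackageLoader loader("model.pt2");

    // Inputs are a flat, ordered vector of tensors -- no names attached.
    std::vector<torch::Tensor> inputs = {torch::randn({8, 10}, torch::kCUDA)};

    // Outputs come back the same way: a flat vector of tensors.
    std::vector<torch::Tensor> outputs = loader.run(inputs);
    std::cout << outputs[0] << std::endl;
    return 0;
  }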

I’d like to use AOTInductor to run inference for a PyTorch model whose forward method returns a dict[str, Tensor]. The forward method also takes a large number of arguments, so ideally, for robustness, I’d be able to call it with named arguments (i.e., a GenericDict) from the C++ side. Based on the signature above, however, the model can only be run with a vector of tensors as input, and only a vector of tensors comes back as output. Is there another way to run the module so that it accepts and returns arbitrary data structures, such as an arbitrary IValue?
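
To make the question concrete, here is roughly what I wish I could write. The dict construction uses real c10 types (the same machinery a GenericDict IValue wraps), but the run overload taking an IValue is hypothetical and, as far as I can tell, does not exist:

  #include <ATen/core/Dict.h>
  #include <ATen/core/ivalue.h>
  #include <torch/csrc/inductor/aoti_package/model_package_loader.h>
  #include <torch/torch.h>

  void desired_usage(torch::inductor::AOTIModelPackageLoader& loader) {
    // Named arguments as a dict[str, Tensor]; the keys here are just
    // illustrative placeholders for the model's many parameters.
    c10::Dict<std::string, at::Tensor> kwargs;
    kwargs.insert("input_ids", torch::randint(0, 1000, {1, 128}, torch::kLong));
    kwargs.insert("attention_mask", torch::ones({1, 128}, torch::kLong));

    // Hypothetical overload -- NOT part of AOTIModelPackageLoader today:
    // c10::IValue result = loader.run(c10::IValue(kwargs));
    //
    // Since forward returns dict[str, Tensor], I would then unpack it as:
    // auto out = result.toGenericDict();
    // at::Tensor logits = out.at("logits").toTensor();
  }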