I am trying to dump a model that contains a Hugging Face transformer with torch.package, and then load it in torch::deploy. Dumping works fine, but loading fails with this error:
terminate called after throwing an instance of 'std::runtime_error'
what(): Exception Caught inside torch::deploy embedded library:
Exception Caught inside torch::deploy embedded library:
ModuleNotFoundError: No module named 'torch._C._nn'; 'torch._C' is not a package
At:
<Generated by torch::deploy>(436): _do_find_and_load
<Generated by torch::deploy>(448): _find_and_load
<Generated by torch::deploy>(478): _gcd_import
<Generated by torch::deploy>(148): import_module
<Generated by torch::deploy>(25): find_class
<Generated by torch::deploy>(1526): load_global
<Generated by torch::deploy>(1212): load
<Generated by torch::deploy>(270): load_pickle
Aborted (core dumped)
Here is my model definition; it only contains a Hugging Face tokenizer and a BERT model:
# file name: bert_model.py
import torch
from transformers import BertTokenizer, BertModel


class MyBertModel(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self._tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        self._model = BertModel.from_pretrained("bert-base-uncased")

    def forward(self, x):
        encoded_input = self._tokenizer(x, return_tensors="pt")
        return self._model(**encoded_input)
And here is how I dump the model:
from torch.package import PackageExporter
from bert_model import MyBertModel

model = MyBertModel()

# Package and export it.
with PackageExporter("bert_package.pt") as e:
    e.intern("bert_model")
    e.extern("sys")
    e.extern("transformers.**")
    e.save_pickle("model", "model.pkl", model)
I load the dumped package the way shown in the demo: torch::deploy — PyTorch 1.12 documentation
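Concretely, my loading code follows the example in those docs roughly like this (a sketch from memory; the interpreter count and the package path are my choices):

```cpp
// Sketch of the loading side, following the torch::deploy example
// from the PyTorch 1.12 docs. Requires linking against libtorch with
// deploy support; not buildable standalone.
#include <torch/csrc/deploy/deploy.h>

#include <iostream>

int main() {
  // Spin up a pool of embedded Python interpreters.
  torch::deploy::InterpreterManager manager(4);
  try {
    torch::deploy::Package package = manager.loadPackage("bert_package.pt");
    // This is the call that dies with the ModuleNotFoundError above.
    torch::deploy::ReplicatedObj model =
        package.loadPickle("model", "model.pkl");
  } catch (const std::exception& e) {
    std::cerr << e.what() << std::endl;
    return 1;
  }
  return 0;
}
```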
My Hugging Face transformers version is 4.20.1.
Can anyone help? Many thanks!