Hi, I am trying to convert my LSTM model to TorchScript so it can be used in a production environment. I followed the guidelines in "Deploying a Seq2Seq Model with the Hybrid Frontend" (https://pytorch.org/tutorials/beginner/deploy_seq2seq_hybrid_frontend_tutorial.html), but I ran into the following error:
```
File "/home/ner/src/model/bilstm.py", line 76, in __init__
    self.forward_lstm = LatticeLSTM(lstm_input, lstm_hidden, gaz_dropout, gaz_alphabet_size, gaz_emb_dim, gaz_embedding, True, HP_fix_gaz_emb, False)
File "/home/.local/lib/python2.7/site-packages/torch/jit/__init__.py", line 891, in init_then_register
    _create_methods_from_stubs(self, methods)
File "/home/.local/lib/python2.7/site-packages/torch/jit/__init__.py", line 852, in _create_methods_from_stubs
    self._create_methods(defs, rcbs, defaults)
File "/home/.local/lib/python2.7/site-packages/torch/jit/__init__.py", line 603, in _try_compile_weak_script
    entry = _compiled_weak_fns.get(fn)
File "/usr/lib64/python2.7/weakref.py", line 284, in get
    return self.data.get(ref(key), default)
TypeError: cannot create weak reference to 'builtin_function_or_method' object
```
Here is the relevant part of my code:
```python
class MyLSTM(torch.jit.ScriptModule):
    __constants__ = ['left2right', 'use_gpu', 'hidden_dim']

    def __init__(self, input_dim, hidden_dim, word_drop, word_alphabet_size,
                 word_emb_dim, pretrain_word_emb=None, left2right=True,
                 fix_word_emb=True, use_gpu=False, use_bias=True):
        super(MyLSTM, self).__init__()
        skip_direction = "forward" if left2right else "backward"
        print "build LatticeLSTM...", skip_direction, ", Fix emb:", fix_word_emb, " gaz drop:", word_drop
        self.use_gpu = use_gpu
        ...

    @torch.jit.script_method
    def forward(self, input, skip_input_list):
        # type: (Tensor, Tuple[List[List[List[int]]], bool])
        skip_input = skip_input_list[0]
        ...
```
Any thoughts?