NMT-SMT hybrid system with PyTorch and Moses SMT

Hi. I’m new to language processing and I’m struggling to connect the output of an encoder-decoder with attention to Moses SMT. I found a repository on GitHub, WMT17 scripts, with everything needed to preprocess, postprocess, train, and evaluate such a system, but its encoder-decoder with attention is written in TensorFlow or Theano. I would like to reproduce the same result with PyTorch.

Note that the encoder-decoder with attention translates a sentence while leaving untouched the words that were tagged with the token, and then passes the result to Moses SMT, which translates only the tagged words. But I have no idea how to begin.
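In case it helps anyone answer, here is a minimal sketch of what I understand the PyTorch side to be: a GRU encoder-decoder with a simple dot-product-style attention. All the class names, dimensions, and the attention variant are my own assumptions for illustration, not taken from the WMT17 scripts:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src):
        # src: (batch, src_len) of token ids
        outputs, hidden = self.rnn(self.embed(src))
        # outputs: (batch, src_len, 2*hid_dim); merge the two directions' final states
        hidden = torch.tanh(hidden[0] + hidden[1]).unsqueeze(0)  # (1, batch, hid_dim)
        return outputs, hidden

class Attention(nn.Module):
    def __init__(self, hid_dim):
        super().__init__()
        self.proj = nn.Linear(2 * hid_dim, hid_dim)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (1, batch, hid_dim); enc_outputs: (batch, src_len, 2*hid_dim)
        keys = self.proj(enc_outputs)  # (batch, src_len, hid_dim)
        scores = torch.bmm(keys, dec_hidden.transpose(0, 1).transpose(1, 2))
        weights = F.softmax(scores, dim=1)  # normalize over source positions
        context = torch.bmm(weights.transpose(1, 2), enc_outputs)  # (batch, 1, 2*hid_dim)
        return context, weights

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = Attention(hid_dim)
        self.rnn = nn.GRU(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, prev_token, hidden, enc_outputs):
        # One decoding step: embed the previous token, attend, update state
        emb = self.embed(prev_token).unsqueeze(1)  # (batch, 1, emb_dim)
        context, _ = self.attn(hidden, enc_outputs)
        output, hidden = self.rnn(torch.cat([emb, context], dim=2), hidden)
        return self.out(output.squeeze(1)), hidden  # logits: (batch, vocab_size)

enc = Encoder(vocab_size=100, emb_dim=32, hid_dim=64)
dec = Decoder(vocab_size=100, emb_dim=32, hid_dim=64)
src = torch.randint(0, 100, (2, 7))  # batch of 2 toy sentences, length 7
enc_out, hidden = enc(src)
logits, hidden = dec(torch.tensor([1, 1]), hidden, enc_out)
print(tuple(logits.shape))  # (2, 100): one distribution over the vocab per sentence
```

This is just the NMT half; the tagged rare words would be kept as-is in the output and handled afterwards by Moses.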
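And here is how I imagine the hand-off step could work: the NMT output keeps rare words wrapped in a placeholder tag, and only those spans get sent to the SMT system. The tag markup and the `smt_translate` stub below are my own guesses, not the actual WMT17 convention, and the toy dictionary just stands in for a real call to Moses:

```python
import re

# Assumed placeholder markup around words the NMT model left untranslated
TAG = re.compile(r"<unk>\s*(.+?)\s*</unk>")

def smt_translate(phrase):
    # Stand-in for a call to Moses (e.g. via its server mode or XML input);
    # here just a toy phrase-table lookup for illustration.
    toy_phrase_table = {"Bundeskanzlerin": "Federal Chancellor"}
    return toy_phrase_table.get(phrase, phrase)

def postprocess(nmt_output):
    # Replace every tagged span with the SMT translation of that span only,
    # leaving the rest of the NMT output untouched.
    return TAG.sub(lambda m: smt_translate(m.group(1)), nmt_output)

result = postprocess("the <unk> Bundeskanzlerin </unk> spoke today")
print(result)  # → the Federal Chancellor spoke today
```

Is this roughly the right picture of the pipeline, and if so, what is the recommended way to drive Moses from Python for the second step?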