Hello everyone
I’m using a standard Transformer for NMT and I want to train the model right-to-left. I have two ideas:
- reversing the input token sequence before feeding the data to the encoder and decoder, so the model reads the text right to left;
- reversing the embedding vectors in the encoder and decoder.
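For the first idea, a minimal sketch of reversing the token sequence while keeping the special markers in place (the function name and the BOS/EOS IDs are hypothetical; adapt them to your tokenizer):

```python
def reverse_for_rtl(token_ids, bos_id, eos_id):
    """Reverse a tokenized sentence for right-to-left training,
    keeping the BOS/EOS markers in their usual positions."""
    # Strip special tokens, reverse the remaining content, re-add markers.
    core = [t for t in token_ids if t not in (bos_id, eos_id)]
    return [bos_id] + core[::-1] + [eos_id]

# Example: BOS=1, EOS=2, content tokens 10..13
src = [1, 10, 11, 12, 13, 2]
print(reverse_for_rtl(src, bos_id=1, eos_id=2))  # [1, 13, 12, 11, 10, 2]
```

This is applied as a preprocessing step, before the embedding layer, so positional encodings and the rest of the model stay unchanged.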
My question is: what is the optimal way to train a right-to-left model?
Note: I have two versions of my model, an RTL model and an LTR model.