What is the optimal way to train a model right to left?

Hello everyone :slight_smile:

I’m using a standard Transformer for NMT and I want to train the model right to left. I have two ideas:

  1. Reversing the token order of the input text (left to right becomes right to left) before feeding the data to the encoder and decoder.

  2. Reversing the sequence of embedding vectors inside the encoder and decoder.
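For idea 1, a minimal sketch of the preprocessing step might look like this. It assumes whitespace tokenization purely for illustration; in practice you would reverse at the subword level after running your real tokenizer, and keep any special tokens (e.g. BOS/EOS) in their usual positions:

```python
def reverse_tokens(sentence: str) -> str:
    """Reverse the token order of a sentence (illustrative whitespace split)."""
    return " ".join(reversed(sentence.split()))

src = "the cat sat on the mat"
print(reverse_tokens(src))  # mat the on sat cat the
```

At inference time you would then reverse the model's output back to restore the normal reading order. Idea 2 achieves the same effect one layer deeper (flipping the sequence dimension of the embedding tensor), but reversing the text itself keeps the model code untouched, which is usually simpler to maintain.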

My question is: which is the optimal way to train a right-to-left model?

Note: I have two versions of my model, an RTL model and an LTR model.