Why is there a ReLU placed after the embedding in the SimpleDecoder of the PyTorch seq2seq tutorial?
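For context, the pattern being asked about looks roughly like the following. This is a hedged sketch, not the tutorial's exact code: the module and variable names (`embedding`, `gru`, `hidden_size`) are approximations of the decoder step where the embedding output is passed through `F.relu` before being fed to the recurrent unit.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical decoder step (names are illustrative, not copied from
# the tutorial): embed one token, apply ReLU, then run the GRU.
hidden_size = 8
vocab_size = 10
embedding = nn.Embedding(vocab_size, hidden_size)
gru = nn.GRU(hidden_size, hidden_size)

token = torch.tensor([3])                  # a single decoder input token
emb = embedding(token).view(1, 1, -1)      # shape (seq=1, batch=1, hidden)
emb = F.relu(emb)                          # zeroes any negative activations
output, hidden = gru(emb, torch.zeros(1, 1, hidden_size))
```

The ReLU here clamps negative components of the embedding vector to zero before the GRU step; whether that nonlinearity is actually necessary at this point is the substance of the question.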