Why is there a ReLU after the Embedding in the Seq2Seq tutorial?

Why is there a ReLU placed after the embedding layer in the `SimpleDecoder` of the PyTorch seq2seq tutorial?
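For context, the pattern in question is an embedding lookup followed immediately by a ReLU, which zeroes every negative component of the embedding vector before it reaches the recurrent layer. A minimal sketch of that effect in plain Python (the table values below are made up purely for illustration, not taken from the tutorial):

```python
# Hypothetical tiny embedding table: vocabulary of 3 tokens, dimension 4.
embedding = [
    [0.5, -0.2, 0.1, -0.9],
    [-0.3, 0.8, -0.1, 0.4],
    [0.2, 0.2, -0.7, 0.6],
]

def embed_then_relu(token_id):
    # Look up the embedding row, then apply ReLU: max(x, 0) elementwise.
    return [max(x, 0.0) for x in embedding[token_id]]

print(embed_then_relu(0))  # [0.5, 0.0, 0.1, 0.0] -- negative components are zeroed
```

As the output shows, any information carried by the sign of an embedding component is discarded, which is presumably what motivates the question.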