Possible error in the attention implementation of the chatbot NLP tutorial

In the chatbot tutorial (https://pytorch.org/tutorials/beginner/chatbot_tutorial.html), I found that the attention implementation may be incorrect:
```python
# Forward through unidirectional GRU
rnn_output, hidden = self.gru(embedded, last_hidden)
# Calculate attention weights from the current GRU output
attn_weights = self.attn(rnn_output, encoder_outputs)
# Multiply attention weights to encoder outputs to get new "weighted sum" context vector
context = attn_weights.bmm(encoder_outputs.transpose(0, 1))
# Concatenate weighted context vector and GRU output using Luong eq. 5
rnn_output = rnn_output.squeeze(0)
```
Because the encoder input samples have different lengths, the number of valid encoder outputs differs across the batch. When computing the attention weights, the padded positions should therefore be masked out, but the attention implementation in the tutorial applies no mask, so the softmax assigns nonzero weight to padding timesteps.
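A minimal sketch of the kind of masking I mean, assuming we have the true sequence `lengths` for the batch (the shapes below mirror the tutorial's `(batch, 1, max_len)` attention energies, but the variable names and sizes here are illustrative, not from the tutorial):

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: attention energies of shape (batch, 1, max_len),
# as produced before the softmax in the tutorial's Attn module.
batch, max_len = 2, 5
energies = torch.randn(batch, 1, max_len)
lengths = torch.tensor([5, 3])  # true (unpadded) length of each sequence

# Boolean mask: True for real tokens, False for padding positions.
mask = torch.arange(max_len).unsqueeze(0) < lengths.unsqueeze(1)  # (batch, max_len)

# Set padded positions to -inf so softmax gives them exactly zero weight.
energies = energies.masked_fill(~mask.unsqueeze(1), float("-inf"))
attn_weights = F.softmax(energies, dim=-1)  # (batch, 1, max_len)

# Padded positions now contribute nothing to the context vector.
print(attn_weights[1, 0])  # last two entries are 0 for the length-3 sequence
```

With this change, `attn_weights.bmm(encoder_outputs.transpose(0, 1))` would only aggregate over real encoder timesteps instead of also averaging in padding.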