State-of-the-art way to do batch norm between an embedding layer and an LSTM layer

What is the state-of-the-art way to apply batch normalization between an embedding layer and an LSTM layer for padded sequences?
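
For concreteness, here is a minimal sketch of the kind of setup I have in mind (assuming PyTorch; the class and parameter names are just placeholders). Applying `nn.BatchNorm1d` directly to the embedded output means the padded timesteps are included in the normalization statistics, which is the part I'm unsure about.

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence


class Model(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, pad_idx=0):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=pad_idx)
        # BatchNorm1d expects (batch, channels, seq_len), so the embedded
        # output (batch, seq_len, emb_dim) is transposed around it.
        self.bn = nn.BatchNorm1d(emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, tokens, lengths):
        x = self.embedding(tokens)                      # (batch, seq_len, emb_dim)
        # Normalize over the embedding dimension; note that padded positions
        # still contribute to the batch statistics here.
        x = self.bn(x.transpose(1, 2)).transpose(1, 2)
        packed = pack_padded_sequence(x, lengths.cpu(), batch_first=True,
                                      enforce_sorted=False)
        _, (h, _) = self.lstm(packed)
        return h[-1]
```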