Can I use nn.Transformer without the masked attention?

I’m trying to use `nn.Transformer` for a seq2seq task, but I don’t want the decoder to be autoregressive. Is there a way to disable the causal mask on the target sequence?
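
For reference, here’s a minimal sketch of what I’m hoping to do (the shapes and sizes are just placeholders). My understanding is that the decoder is only causal if you pass a `tgt_mask`, so simply omitting it would give full bidirectional attention over the target, but I’d like to confirm that:

```python
import torch
import torch.nn as nn

model = nn.Transformer(d_model=512, nhead=8)

src = torch.rand(10, 32, 512)  # (source_len, batch, d_model)
tgt = torch.rand(20, 32, 512)  # (target_len, batch, d_model)

# Autoregressive usage would pass the causal mask explicitly:
# tgt_mask = model.generate_square_subsequent_mask(tgt.size(0))
# out = model(src, tgt, tgt_mask=tgt_mask)

# Non-autoregressive (what I want): no tgt_mask, so every target
# position can attend to every other target position.
out = model(src, tgt)
print(out.shape)  # torch.Size([20, 32, 512])
```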