Error message when training torch.nn.Transformer: "ERROR:root:Creating dummy sequence since config seq length is greater than sample sequence"

Hi,

I got the following error message when training a transformer model using torch.nn.Transformer.

ERROR:root:Creating dummy sequence since config seq length is greater than sample sequence

Could anyone shed some light on what might be causing this? Thanks!

You'll have to paste your code for debugging. Feel free to open an issue on GitHub and tag @zhangguanheng66 for questions related to nn.Transformer.

Thanks, I will create a minimal working example!
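For reference, here is a minimal self-contained sketch of what such a working example might look like. The dimensions, random tensors, and the dummy MSE loss are placeholders for illustration only and are not the original training setup that produced the error:

```python
import torch
import torch.nn as nn

# Placeholder dimensions for the sketch.
d_model, nhead = 64, 4
src_len, tgt_len, batch_size = 10, 12, 8

model = nn.Transformer(d_model=d_model, nhead=nhead,
                       num_encoder_layers=2, num_decoder_layers=2)
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Random tensors stand in for real data, shaped (seq_len, batch, d_model)
# since nn.Transformer defaults to batch_first=False.
src = torch.rand(src_len, batch_size, d_model)
tgt = torch.rand(tgt_len, batch_size, d_model)

for step in range(3):
    optimizer.zero_grad()
    out = model(src, tgt)          # output shape: (tgt_len, batch, d_model)
    loss = criterion(out, tgt)     # dummy regression target, just for the sketch
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss = {loss.item():.4f}")
```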