TransformerEncoderLayer init error

Hello. I'm having trouble using TransformerEncoderLayer.

My torch version is 1.8.1+cu102.

When I use batch_first, it shows an error:

torch.nn.TransformerEncoderLayer(d_model=time_step, nhead=4, dropout=0.2, batch_first=True)

TypeError: __init__() got an unexpected keyword argument 'batch_first'

Version 1.8.1 does not take a batch_first argument (ref: TransformerEncoderLayer — PyTorch 1.8.1 documentation); if you want that, you need to upgrade to 1.9.0.
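
If upgrading isn't an option, a minimal sketch of the usual workaround on 1.8.x is to keep the default layout and transpose the batch-first input to (seq_len, batch, d_model) before the layer and back afterward. The sizes below (time_step, batch size, sequence length) are made up for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only.
time_step, batch_size, seq_len = 16, 8, 10

# On 1.8.1 the layer always expects input shaped (seq_len, batch, d_model).
layer = nn.TransformerEncoderLayer(d_model=time_step, nhead=4, dropout=0.2)

x = torch.randn(batch_size, seq_len, time_step)  # batch-first data: (batch, seq_len, d_model)
out = layer(x.transpose(0, 1)).transpose(0, 1)   # to (seq_len, batch, d_model) and back

print(out.shape)  # torch.Size([8, 10, 16])
```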
