I also met the same error, and I solved it by installing a newer PyTorch. Specifically, 1.12 doesn't work but 1.13 does in my case. Hope that helps.
I met the same error using v2.0. Unfortunately, my setup strictly requires num_heads = 1.
It is strange, because when nn.Module.training is True (i.e. while I'm training my model), no error occurs.
But when I call .eval() to evaluate the model without dropout, the error comes back: RuntimeError: Only support when num_heads is even in transformer.
Stranger still, if I drop the src_key_padding_mask input the error disappears and the model runs. But for an attention model, I really need the mask!
```python
# This avoids the error, but is not a wise choice:
# the padding mask is silently dropped at eval time.
if self.training:
    encode_X = self.transformer_encoder(X, src_key_padding_mask=mask)
else:
    encode_X = self.transformer_encoder(X)
return encode_X
```
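An alternative workaround that keeps the mask, sketched below under the assumption that the error comes from the fused eval-mode fast path (which is only taken when `self.training` is False): leave the module in `train()` mode so the fast path is skipped, and rely on `dropout=0.0` plus `torch.no_grad()` to make the forward pass match eval-mode behavior. The dimensions and module here are hypothetical, not from the original post.

```python
import torch
import torch.nn as nn

# Minimal standalone encoder; nhead=1 is the case that trips the
# "Only support when num_heads is even" fast-path restriction in eval mode.
layer = nn.TransformerEncoderLayer(d_model=16, nhead=1, dropout=0.0,
                                   batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

X = torch.randn(4, 10, 16)                     # (batch, seq, d_model)
mask = torch.zeros(4, 10, dtype=torch.bool)    # key padding mask, all valid

# Keep training mode so the fused fast path is not used; with dropout=0.0
# and no_grad, the output is deterministic, matching eval semantics.
encoder.train()
with torch.no_grad():
    out = encoder(X, src_key_padding_mask=mask)

print(out.shape)
```

This is only a sketch: it assumes your layers behave identically in train and eval mode once dropout is zero (true for plain transformer layers, but not for modules containing e.g. BatchNorm).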