This is a very strange error I’m getting:
```
Exception has occurred: RuntimeError
Only support when num_heads is even in transformer
```
The error is raised from my `forward` method, even though I’m training, not running inference:

```python
def forward(self, indices, mask):
    x = self.embed(indices)
    x = self.encoder(x, src_key_padding_mask=mask)
    return x
```
I have the number of heads set to 1, and that has never caused a problem before. The same code works when
Edit: I have verified that this behavior changed between v1.11 and v1.12. Is this change documented anywhere? I can’t find it.
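For anyone hitting the same thing: a minimal sketch of a setup like the one described above (the class, vocabulary size, and dimensions here are my guesses, not the original code). Passing `enable_nested_tensor=False` to `nn.TransformerEncoder` reportedly keeps the encoder off the fused fast path introduced around v1.12, which is where the even-`num_heads` restriction appears to come from:

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    # Hypothetical stand-in for the poster's model; names and sizes are assumptions.
    def __init__(self, vocab_size=100, d_model=16, num_heads=1):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, num_heads, batch_first=True)
        # enable_nested_tensor=False avoids the nested-tensor / fused fast path,
        # which is the workaround reported for the odd-num_heads RuntimeError.
        self.encoder = nn.TransformerEncoder(layer, num_layers=1,
                                             enable_nested_tensor=False)

    def forward(self, indices, mask):
        x = self.embed(indices)
        return self.encoder(x, src_key_padding_mask=mask)

model = TinyEncoder()
indices = torch.randint(0, 100, (2, 5))            # (batch, seq)
mask = torch.zeros(2, 5, dtype=torch.bool)          # no positions padded
out = model(indices, mask)
print(out.shape)  # torch.Size([2, 5, 16])
```

This is only a sketch under those assumptions; the underlying restriction itself still seems undocumented outside the release notes and GitHub issues.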