Multihead attention

Should the last dimension of the input be the same as the embed_dim parameter that is passed to nn.MultiheadAttention in PyTorch?
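
For context, here is a minimal sketch of the setup I have in mind (the specific sizes, num_heads, and batch_first=True are just assumptions for illustration):

```python
import torch
import torch.nn as nn

embed_dim = 64   # the embed_dim parameter passed to MultiheadAttention
seq_len, batch = 10, 2

mha = nn.MultiheadAttention(embed_dim=embed_dim, num_heads=4, batch_first=True)

# Is it required that the input's last dimension equals embed_dim?
x = torch.randn(batch, seq_len, embed_dim)  # (batch, seq, embed_dim)
out, attn_weights = mha(x, x, x)            # self-attention over x
print(out.shape)  # torch.Size([2, 10, 64])
```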