Transformer: multi-head self-attention

Hello, I'm working with the Swin Transformer block. I use shifted window partitioning to split the feature map into smaller windows, then apply multi-head self-attention inside each window, with heads = 6.
What I want to know is: for each window, do I project the query, key, and value 6 times (once per head)?
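
For reference, below is a minimal sketch (assuming PyTorch; the class name `WindowAttention`, the dims, and the window size are illustrative, and the relative position bias and shift mask of the real Swin block are omitted) of how window multi-head self-attention is commonly implemented: a single fused linear layer projects Q, K, and V for all 6 heads in one pass, and the result is then reshaped/split into heads, rather than running 6 separate projections per window.

```python
import torch
import torch.nn as nn

class WindowAttention(nn.Module):
    """Multi-head self-attention over one window (simplified sketch).
    Assumes dim is divisible by num_heads."""
    def __init__(self, dim=96, num_heads=6):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        # One fused linear produces Q, K and V for ALL heads at once;
        # there is no loop of 6 separate projection passes per window.
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (num_windows * batch, tokens_per_window, dim)
        B_, N, C = x.shape
        qkv = self.qkv(x)                                   # (B_, N, 3*C)
        qkv = qkv.reshape(B_, N, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)                # each: (B_, heads, N, head_dim)
        attn = (q @ k.transpose(-2, -1)) * self.scale       # (B_, heads, N, N)
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B_, N, C)  # concatenate heads back
        return self.proj(out)

x = torch.randn(8, 7 * 7, 96)      # e.g. 8 windows of 7x7 tokens, dim 96
print(WindowAttention()(x).shape)  # torch.Size([8, 49, 96])
```

Mathematically this is equivalent to giving each head its own smaller projection matrix, since the one big weight matrix is just the per-head matrices stacked together; each head still only sees its own `head_dim`-sized slice of the projection before attention is computed.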