I am doing some experiments on positional encoding, and would like to use torch.nn.Transformer for my experiments.
But there seems to be no argument for changing the positional encoding, and I also cannot find where torch.nn.Transformer handles the positional encoding in the source code.
How can I replace the default sin/cos encoding with my own custom encoding?
Hi, I'm no expert on PyTorch or Transformers, but I believe nn.Transformer doesn't apply any positional encoding itself. You have to implement it yourself and add it to the token embeddings before passing them into the model.
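A minimal sketch of how that usually looks: implement the encoding as a separate module and add its output to the embeddings before calling nn.Transformer. Here I use the standard sinusoidal formula from "Attention Is All You Need" as a placeholder; the class name, `max_len`, and the seq-first tensor layout are my own choices, and you would swap the table-building code for your custom encoding.

```python
import math

import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding; replace the table construction
    below with your own custom encoding scheme."""

    def __init__(self, d_model: int, max_len: int = 5000):
        super().__init__()
        position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)
        )
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)  # even dims: sin
        pe[:, 1::2] = torch.cos(position * div_term)  # odd dims: cos
        # Shape (max_len, 1, d_model) so it broadcasts over the batch
        # dimension in the (seq_len, batch, d_model) layout that
        # nn.Transformer uses by default (batch_first=False).
        self.register_buffer("pe", pe.unsqueeze(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, batch, d_model)
        return x + self.pe[: x.size(0)]
```

Then the usual pipeline is embedding, then positional encoding, then the Transformer, e.g. `transformer(pos_enc(src_emb), pos_enc(tgt_emb))`. Since nn.Transformer never touches positions, whatever you add here (learned embeddings, relative encodings applied differently, etc.) is entirely up to you.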