How to embed Cartesian coordinates

Hi!

I'm trying to use a Transformer to predict a sequence of (X, Y) coordinates. My question is how to embed the coordinates into a D-dimensional space so that I can feed them into my Transformer model.

So far I’ve done something like the following:

First, I scale the coordinates to values between 0 and 100.
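The scaling step looks roughly like this (min-max scaling per axis; the raw coordinate values below are just made-up examples, since the real ranges depend on my data):

import torch

# raw (x, y) coordinates; values here are made-up examples
coords = torch.tensor([[512.0, 131.0], [277.0, 880.0], [64.0, 940.0]])

# min-max scale each axis into the range [0, 100]
mins = coords.min(dim=0, keepdim=True).values
maxs = coords.max(dim=0, keepdim=True).values
scaled = (coords - mins) / (maxs - mins) * 100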

import torch
import torch.nn as nn

input_dim = 100        # number of discrete coordinate values (valid indices: 0-99)
embedding_dim = 256    # size of the embedding vector for each coordinate value
embedding = nn.Embedding(input_dim, embedding_dim)

# two positions, each an (X, Y) pair:  [[X, Y], [X, Y]]
input_to_embed = torch.tensor([[20, 30], [30, 20]])
embed = embedding(input_to_embed)
print(embed.shape)  # torch.Size([2, 2, 256])

However, when the coordinates have decimals (like [20.3, 30.4]), I get an error because nn.Embedding expects indices of scalar type Long, and my values do not have that type.

I suspect that this is not the best way to solve the problem, so I am happy to hear your thoughts and suggestions!
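One alternative I have been wondering about (just a rough sketch, not something I know to be correct) is to skip the discretization entirely, treat each (X, Y) pair as two continuous features, and project it to the model dimension with a linear layer instead of nn.Embedding:

import torch
import torch.nn as nn

embedding_dim = 256

# project each continuous (x, y) pair to a D-dimensional vector
coord_proj = nn.Linear(2, embedding_dim)

coords = torch.tensor([[20.3, 30.4], [30.1, 20.7]])  # floats are fine here
embed = coord_proj(coords)
print(embed.shape)  # torch.Size([2, 256])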

Hello, did you find a solution? I am facing a similar situation.