How can I wrap a parameter/tensor into an embedding matrix for lookup?

Hi, I have a learnable parameter/tensor that is already part of the computational graph. Is there any way to wrap it into an nn.Embedding for lookup?
Thanks

You can do a lookup by just indexing into it:

import torch
import torch.nn as nn

x = nn.Parameter(torch.randn(100000, 128))  # learnable parameter tensor
indices = torch.tensor([3, 44, 193, 2], dtype=torch.int64)

output = x[indices]  # shape: (4, 128)
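
If you specifically want the nn.Embedding interface, a minimal sketch is to use nn.Embedding.from_pretrained with freeze=False so the weight stays trainable. Note that this registers the tensor as a fresh leaf Parameter; if the tensor must remain a node in an existing graph, the functional form F.embedding performs the same lookup on the original tensor:

import torch
import torch.nn as nn
import torch.nn.functional as F

weight = torch.randn(100000, 128)  # existing weight tensor

# Wrap the weight in an nn.Embedding module; freeze=False keeps it learnable.
emb = nn.Embedding.from_pretrained(weight, freeze=False)

indices = torch.tensor([3, 44, 193, 2], dtype=torch.int64)
out1 = emb(indices)

# Functional form: same lookup without creating a module, so an existing
# graph node can be passed in as the weight directly.
out2 = F.embedding(indices, weight)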

Thanks for your reply.
This is exactly what I do right now.
I am wondering whether it is as efficient as nn.Embedding, which supports GPU lookup.