Randomly initialized embeddings for torchtext

This code snippet would assign the embedding vectors to the nn.Embedding layer.
Note that nn.Embedding already randomly initializes its weight parameter, but you can of course reassign it.
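For example, a minimal sketch of reassigning the weight (vocab_size and embed_dim are placeholder values):

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 10000, 300  # placeholder sizes

emb = nn.Embedding(vocab_size, embed_dim)  # weight is already randomly initialized

# Overwrite the existing weight in-place with custom random values;
# no_grad avoids recording the copy in autograd
with torch.no_grad():
    emb.weight.copy_(torch.randn(vocab_size, embed_dim))
```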

You could use torch.from_numpy(np.random.rand(...)).float() instead, since torch.from_numpy shares memory with the NumPy array (although the .float() cast from float64 will still allocate a new float32 tensor), but your code should also work.
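To truly avoid the copy, you could cast the NumPy array to float32 first, e.g. (again with placeholder sizes):

```python
import numpy as np
import torch
import torch.nn as nn

vocab_size, embed_dim = 10000, 300  # placeholder sizes

# Cast to float32 on the NumPy side, so torch.from_numpy can share
# the buffer directly without a dtype conversion
weights = np.random.rand(vocab_size, embed_dim).astype(np.float32)

emb = nn.Embedding(vocab_size, embed_dim)
# torch.from_numpy shares memory with the array, and nn.Parameter
# wraps the tensor without copying it
emb.weight = nn.Parameter(torch.from_numpy(weights))
```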
