Word2Vec as input to LSTM

Thank you, Chris, for the help, especially the requires_grad part.
So I have made some changes and these are the steps I followed:
    # 1. Persist the word2vec model I created using gensim
    model.save('w2v.model')

    # 2. Load the model
    model = Word2Vec.load('w2v.model')

    # 3. Build the embedding layer from the pretrained vectors
    weights = torch.FloatTensor(model.wv.vectors)
    embedding = nn.Embedding.from_pretrained(weights)

Do these steps seem correct (I haven't added requires_grad yet)? The word2vec dimension is 200.
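For what it's worth, here is a minimal sketch of step 3 plus the requires_grad part, using only PyTorch. A random matrix stands in for `model.wv.vectors` (10 words, 200 dimensions as in the post); the gensim save/load is assumed to have happened already. The point is that `nn.Embedding.from_pretrained` freezes the weights by default, and `freeze=False` is how you make them trainable:

```python
import torch
import torch.nn as nn

# Stand-in for model.wv.vectors: a vocab of 10 words with
# 200-dimensional vectors, matching the dimension in the post.
vectors = torch.randn(10, 200)

weights = torch.FloatTensor(vectors)

# from_pretrained defaults to freeze=True, so the embedding
# weights have requires_grad == False and stay fixed in training.
frozen = nn.Embedding.from_pretrained(weights)

# Pass freeze=False if you want to fine-tune the word vectors.
trainable = nn.Embedding.from_pretrained(weights, freeze=False)

print(frozen.weight.requires_grad)     # False
print(trainable.weight.requires_grad)  # True

# Looking up 2 token ids yields a (2, 200) tensor, which is the
# shape an LSTM with input_size=200 would consume per timestep.
print(frozen(torch.tensor([0, 3])).shape)
```

So if you keep the default you don't need to touch requires_grad at all; it is already set for you by the `freeze` argument.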
