Is there any advantage to loading my fasttext embeddings into an nn.Embedding layer (which I keep frozen, btw) instead of just using the fasttext model to get the word vectors?
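
To make the first option concrete, here is a minimal sketch of what I mean (assuming the `fasttext` Python package and a pretrained `.bin` model; the model path and the vocabulary are just placeholders):

```python
import fasttext
import numpy as np
import torch
import torch.nn as nn

# Load a pretrained fastText model (path is a placeholder).
ft = fasttext.load_model("cc.en.300.bin")

# Fixed vocabulary built from my training corpus (example words only).
vocab = ["the", "cat", "sat"]
word2idx = {w: i for i, w in enumerate(vocab)}

# One vector per in-vocabulary word, copied out of the fastText model.
weights = np.stack([ft.get_word_vector(w) for w in vocab])

# Frozen embedding layer: the rows are never updated during training.
embedding = nn.Embedding.from_pretrained(
    torch.from_numpy(weights), freeze=True
)

ids = torch.tensor([word2idx["cat"]])
vec = embedding(ids)  # same vector fastText would return for "cat"
```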
I mean, the big advantage of fasttext is its ability to create an embedding for an OOV word based on its character n-grams. If I use an Embedding layer (and don't fine-tune it), I lose that ability, as the snippet below illustrates.
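
```python
# Querying the fastText model directly handles OOV words via
# character n-grams (the misspelling here is just an example):
oov_vec = ft.get_word_vector("catt")  # works, composed from subwords

# With the frozen nn.Embedding above, "catt" has no row in the weight
# matrix, so it would typically have to be mapped to some <unk> index.
```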