Yes, you can do that: just add an extra embedding of a suitable dimension and concatenate it with the word embedding along the feature axis.
import torch
import torch.nn as nn

embedding = nn.Embedding(vocab_size, dim)
emb = embedding(tokens)                    # shape: (batch_size, seq_len, dim)
# extended: entity embeddings of shape (batch_size, seq_len, extended_dim)
final = torch.cat([emb, extended], dim=2)  # shape: (batch_size, seq_len, dim + extended_dim)
final is now a vector that carries the named-entity embeddings as well.
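
For completeness, here is a minimal self-contained sketch; the sizes (vocab_size, entity_vocab_size, etc.) and the random input tensors are placeholders for illustration only, so swap in your own data:

import torch
import torch.nn as nn

# hypothetical sizes, chosen only for illustration
vocab_size, entity_vocab_size = 10000, 10
dim, extended_dim = 300, 20
batch_size, seq_len = 4, 12

word_embedding = nn.Embedding(vocab_size, dim)
entity_embedding = nn.Embedding(entity_vocab_size, extended_dim)

tokens = torch.randint(0, vocab_size, (batch_size, seq_len))           # word ids
entities = torch.randint(0, entity_vocab_size, (batch_size, seq_len))  # entity-tag ids

# concatenate word and entity embeddings along the feature dimension
final = torch.cat([word_embedding(tokens), entity_embedding(entities)], dim=2)
print(final.shape)  # torch.Size([4, 12, 320]) i.e. dim + extended_dim

Any layer downstream (LSTM, Transformer, linear, ...) then just needs to expect an input size of dim + extended_dim.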