Softmax vs. Negative Sampling

Dear Community,

Negative sampling was introduced in the word2vec model as a substitute for the expensive softmax function.

One question though: if we had unlimited processing power, would we still use negative sampling?
Is it better to update only a few weights (NS) than all of them (softmax)?
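To make the cost difference concrete, here is a minimal numpy sketch of one training step's gradient on the output embeddings. All numbers (vocabulary size, embedding dimension, number of negatives, the target index) are illustrative, and the negatives are drawn uniformly here, whereas word2vec actually draws them from a unigram^0.75 distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
V, d, k = 10_000, 100, 5              # vocab size, embedding dim, negatives (illustrative)
W_out = rng.normal(0, 0.01, (V, d))   # output embedding matrix
h = rng.normal(0, 0.01, d)            # hidden/context vector for the current center word
target = 42                           # index of the true context word (arbitrary)

# Full softmax: the cross-entropy gradient is dense,
# so every one of the V output rows gets an update.
scores = W_out @ h                    # O(V * d) just to compute the scores
p = np.exp(scores - scores.max())
p /= p.sum()
p[target] -= 1.0                      # dL/dscores for softmax cross-entropy
grad_full = np.outer(p, h)            # shape (V, d): essentially all rows non-zero

# Negative sampling: only the target row and k sampled rows are touched.
# (Uniform sampling excluding the target; word2vec uses unigram^0.75.)
neg = rng.choice(np.delete(np.arange(V), target), size=k, replace=False)
rows = np.concatenate(([target], neg))
labels = np.array([1.0] + [0.0] * k)  # 1 for the true pair, 0 for negatives
sig = 1.0 / (1.0 + np.exp(-(W_out[rows] @ h)))   # O((k+1) * d)
grad_ns = np.outer(sig - labels, h)   # shape (k+1, d): only 6 rows updated

print(grad_full.shape, grad_ns.shape)
```

So per step the softmax gradient touches all 10,000 output rows, while negative sampling touches only k+1 = 6, which is the whole computational argument for NS; whether the full gradient would also be *statistically* preferable given free compute is exactly the question above.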

Would love to hear your thoughts.