Hi,
Does sampled softmax loss exist in PyTorch? I could not find it.
Thanks
Can you elaborate on what this is, with a link to an implementation elsewhere?
AFAIK this is not in PyTorch core.
Thanks for your reply.
Tensorflow has this: https://www.tensorflow.org/api_docs/python/nn/candidate_sampling
We don’t have any such thing in the core. We’ll need to add them. Thanks for the pointer!
There’s a Lua Torch implementation.
@ngimel: Thanks for the link. I don’t know Lua and will have a look at the code.
As far as I know, NCE (Noise Contrastive Estimation) is different from TensorFlow’s sampled softmax; see Jozefowicz et al. (2016) or here for a comparison.
EDIT: sorry, I see that original link is to a page with a number of different softmax approximations, and NCE is one of them. I personally would be more interested in sampled softmax, as it tends to work better for me.
EDIT2: here is a TF implementation of sampled softmax and NCE, hopefully they can be implemented using existing pytorch functions.
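Along those lines, here is a minimal sketch of how sampled softmax might be built from existing PyTorch ops. Note this is an illustration, not the TF algorithm: it draws negatives uniformly (TF defaults to a log-uniform sampler) and skips the subtraction of log expected counts and the removal of accidental hits, so the gradients are only approximately comparable.

```python
import torch
import torch.nn.functional as F

def sampled_softmax_loss(weight, bias, labels, inputs, num_sampled, num_classes):
    """Sketch of sampled softmax: score the true class plus a small set of
    uniformly drawn negatives, then take a softmax over that reduced set.
    weight: (num_classes, dim), bias: (num_classes,),
    inputs: (batch, dim), labels: (batch,) class indices."""
    batch = labels.size(0)
    # draw negative class ids uniformly (collisions with labels not removed here)
    sampled = torch.randint(0, num_classes, (num_sampled,))
    # logits for each example's true class: shape (batch,)
    true_logits = (inputs * weight[labels]).sum(dim=1) + bias[labels]
    # logits for the shared sampled classes: shape (batch, num_sampled)
    sampled_logits = inputs @ weight[sampled].t() + bias[sampled]
    logits = torch.cat([true_logits.unsqueeze(1), sampled_logits], dim=1)
    # after the cat, the true class sits in column 0 for every example
    targets = torch.zeros(batch, dtype=torch.long)
    return F.cross_entropy(logits, targets)
```

The key point is that the backward pass only touches `weight[labels]` and `weight[sampled]`, so the cost scales with `num_sampled` rather than the full vocabulary.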
You may also be interested in this implementation:
Giving very good results for the LM task.
Any updates on this? It probably isn’t a priority… but I secretly wish PyTorch had a development team as big as TensorFlow’s and could add these functionalities easily!
@windweller Adaptive Softmax was part of PyTorch 0.4.1, see: https://pytorch.org/docs/stable/nn.html#torch.nn.AdaptiveLogSoftmaxWithLoss
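For anyone landing here, a quick usage sketch of that module (the vocabulary size and cutoffs below are made-up numbers for illustration):

```python
import torch
import torch.nn as nn

# adaptive softmax over a hypothetical 10,000-word vocab, hidden size 64;
# cutoffs split the vocab into a frequent head and two rarer tail clusters
asm = nn.AdaptiveLogSoftmaxWithLoss(in_features=64, n_classes=10000,
                                    cutoffs=[100, 1000])
hidden = torch.randn(32, 64)             # batch of hidden states
target = torch.randint(0, 10000, (32,))  # target word indices
out = asm(hidden, target)                # namedtuple (output, loss)
# out.output: log-probability of each example's target, shape (32,)
# out.loss: scalar negative log-likelihood averaged over the batch
```

Like sampled softmax, it avoids computing the full `hidden @ W_vocab` product for most examples, but it does so deterministically by exploiting the word-frequency distribution rather than by random sampling.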
Sampled Softmax is implemented in this repo: https://github.com/rdspring1/PyTorch_GBW_LM
Nice and incredible!! Will start using it now