Implementation of Improved SemHash


Since official PyTorch already includes an implementation of Gumbel-Softmax, I wonder whether PyTorch also has an implementation of Improved SemHash ([1801.09797] Discrete Autoencoders for Sequence Models), similar to the semhash function in TensorFlow (tensor2tensor/ at master · tensorflow/tensor2tensor · GitHub), which is an alternative to Gumbel-Softmax. I'd truly appreciate it if anyone could help. Thanks so much!