How to apply softmax to a batched tensor with variable-length rows?

I have a padded tensor like this: x = torch.Tensor([[1, 2, 3, 0], [4, 5, 0, 0], [6, 7, 0, 0]]), where the trailing zeros are padding. How can I compute y = softmax(x, dim=1) so that the padded positions stay zero, i.e., y = torch.Tensor([[a, b, c, 0], [d, e, 0, 0], [f, g, 0, 0]])? I really appreciate it.

You may want to use a masked softmax; see, e.g., https://github.com/allenai/allennlp/blob/master/allennlp/nn/util.py#L216
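Here is a minimal sketch of the idea, assuming the zeros in your tensor mark padding and you can build a boolean mask of the valid positions (the mask below is an assumption constructed from your example; it is not derived from x itself, since a real 0 value could also appear in valid data):

    import torch
    import torch.nn.functional as F

    # Padded input from the question; trailing zeros are padding.
    x = torch.tensor([[1., 2., 3., 0.],
                      [4., 5., 0., 0.],
                      [6., 7., 0., 0.]])

    # Hypothetical mask: True at valid positions, False at padding.
    mask = torch.tensor([[True, True, True, False],
                         [True, True, False, False],
                         [True, True, False, False]])

    # Set padded positions to -inf before the softmax: exp(-inf) = 0,
    # so they get exactly zero probability and the valid entries
    # renormalize among themselves.
    y = F.softmax(x.masked_fill(~mask, float('-inf')), dim=1)

One caveat with the -inf trick: a row that is entirely padding produces NaN. The AllenNLP function linked above instead multiplies by the mask and renormalizes, which degrades more gracefully in that case.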

Thanks for your advice. It's a great approach.