Sparse torch.topk: can hybrid sparse+dense tensors help?

When I make sure that the class dimension is the first one and call to_sparse(1) instead of to_sparse(), _values() still contains a lot of zeros. It seems that PyTorch isn’t able to discover the nnz pattern automatically, so directly using torch.sparse_coo_tensor to construct a hybrid tensor could be the way forward:

>>> a = torch.eye(3)
>>> a
tensor([[1., 0., 0.],
        [0., 1., 0.],
        [0., 0., 1.]])
>>> a.to_sparse()._values()
tensor([1., 1., 1.])
>>> a.to_sparse(1)._values()
tensor([[1., 0., 0.],
        [0., 1., 0.],
        [0., 0., 1.]])