Logsumexp trick for log(2 / (1 + exp(x)))

I am computing similarities between embeddings, converting the similarities to probabilities, and using a cross-entropy loss.

Is there anything like the logsumexp trick I can do to avoid computing exp(x) in log(2 / (1 + exp(x)))?
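
For context, here is why the direct evaluation is a problem (a quick check, assuming float32 tensors): exp(x) overflows to inf once x is moderately large, which drags the whole expression to -inf.

import torch

x = torch.tensor([1.0, 10.0, 100.0])
torch.exp(x)                        # tensor([2.7183e+00, 2.2026e+04, inf]) -- exp(100) overflows float32
torch.log(2 / (1 + torch.exp(x)))   # tensor([-0.6201, -9.3069, -inf]) -- last entry should be about -99.31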

I am not sure about the exact formulation of your loss function, but from what I understand, the logsumexp trick works for log(2 / (1 + exp(x))) as well:

log(2 / (1 + exp(x))) = log(2) - log(1 + exp(x)) = log(2) - logsumexp([0, x])

import torch

x = torch.rand(5) * 10

# direct evaluation -- exp(x) can overflow for large x
actual = torch.log(2 / (1 + torch.exp(x)))

# pair every x with a 0 so that logsumexp over each row gives log(1 + exp(x))
catzero = torch.cat((torch.zeros(5, 1), x.view(5, 1)), dim=1)  # 5x2
logexptrick = torch.log(torch.tensor(2.0)) - torch.logsumexp(catzero, dim=1)
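
Continuing from the snippet above, a quick check that the two agree, plus the large-x case where the trick actually matters; the softplus line is just an equivalent way of writing log(1 + exp(x)) that PyTorch also evaluates stably.

import torch.nn.functional as F

print(torch.allclose(actual, logexptrick))  # True -- both agree while exp(x) stays finite

# for large x the direct formula collapses to -inf, the logsumexp form stays accurate
big = torch.tensor([100.0, 200.0])
catzero_big = torch.cat((torch.zeros(2, 1), big.view(2, 1)), dim=1)
print(torch.log(2 / (1 + torch.exp(big))))                                  # tensor([-inf, -inf])
print(torch.log(torch.tensor(2.0)) - torch.logsumexp(catzero_big, dim=1))   # tensor([ -99.3069, -199.3069])

# log(1 + exp(x)) is softplus(x), so this is equivalent (and also numerically stable)
print(torch.log(torch.tensor(2.0)) - F.softplus(big))                       # tensor([ -99.3069, -199.3069])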