# Harmonic mean of two logits batches

Hi, I have two batches containing logits produced by classifiers. I'd like to produce another batch of the same size, holding the harmonic mean of each pair of values.

```
batch_A = tensor([[[[-0.39453]]],
                  [[[-0.39018]]],
                  [[[ 0.01991]]],
                  [[[-0.40854]]],
                  [[[-0.39593]]],

batch_B = tensor([[-0.00552],
                  [ 0.26541],
                  [ 1.66024],
                  [-0.33280],
                  [ 0.35407],
```

I'd like a new batch holding the harmonic mean of batch_A and batch_B, ideally keeping a grad_fn (e.g. `grad_fn=<AddmmBackward>`, identical to batch_B).

I first tried just summing the two batches with the code below. It kind of worked, but the result had `grad_fn=<CopySlices>`:

```
for ele in range(len(batch_B)):
    batch_B[ele] = batch_B[ele] + batch_A[ele]
```
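For what it's worth, the slice-by-slice writes are what produce `grad_fn=<CopySlices>`; a single broadcasted, out-of-place add keeps a plain `AddBackward` (a sketch with dummy tensors shaped like the ones above — the `view` call is my assumption for lining up the two shapes):

```python
import torch

# Dummy stand-ins shaped like the batches in the post.
batch_A = torch.randn(5, 1, 1, 1)
batch_B = torch.randn(5, 1, requires_grad=True)

# Out-of-place add: returns a fresh tensor with grad_fn=<AddBackward0>,
# instead of mutating batch_B slice by slice (grad_fn=<CopySlices>).
summed = batch_B + batch_A.view(-1, 1)
```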

Then I tried the same pattern, but taking the harmonic mean of each pair of elements:

```
for ele in range(len(batch_B)):
    batch_B[ele] = (batch_B[ele] * batch_A[ele] * 2) / (batch_B[ele] + batch_A[ele])
```

But I got the error below:

```
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor], which is output 0 of SelectBackward, is at version 32; expected version 31 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
```
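For the record, the error disappears if the harmonic mean is computed out of place over the whole batch, instead of writing into batch_B element by element: nothing is overwritten, so no tensor's version counter moves. A minimal sketch with dummy tensors shaped like the ones above (the `view` call to align the two shapes is my assumption):

```python
import torch

# Dummy stand-ins shaped like the batches in the post.
batch_A = torch.randn(5, 1, 1, 1, requires_grad=True)
batch_B = torch.randn(5, 1, requires_grad=True)

# Out-of-place harmonic mean over the whole batch: no in-place writes,
# so autograd's saved tensors stay at their expected versions.
a = batch_A.view(-1, 1)
hmean = 2 * a * batch_B / (a + batch_B)

hmean.sum().backward()  # backprops through both batches without errors
```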

Really appreciate any help.

hi,

1. `c = concat(a, b)` → `torch.cat((a, b), dim=0)`
2. `((1 / c).mean(dim=0)) ** (-1)`
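Putting those two steps together for per-pair means, a sketch (I use `torch.stack` rather than `torch.cat` so that dim 0 indexes the pair rather than merging the batches, and I reshape batch_A since its shape `[5, 1, 1, 1]` differs from batch_B's `[5, 1]`; both assumptions, not part of the original recipe):

```python
import torch

# Dummy positive stand-ins shaped like the batches in the post
# (harmonic mean assumes positive values).
batch_A = torch.randn(5, 1, 1, 1).abs().view(-1, 1)
batch_B = torch.randn(5, 1).abs()

# Step 1: stack along a new leading dim so c[0] and c[1] are the two batches.
c = torch.stack((batch_A, batch_B))        # shape [2, 5, 1]

# Step 2: reciprocal of the mean of reciprocals = pairwise harmonic mean.
hmean = (1.0 / c).mean(dim=0) ** (-1)      # shape [5, 1]
```

This matches the elementwise formula `2ab / (a + b)` for each pair.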

also

The harmonic mean is also concave, which is an even stronger property than Schur-concavity. One has to take care to only use positive numbers though, since the mean fails to be concave if negative values are used.

```
Hmean = torch.cat(batch_A, batch_B, dim=0)
```

But got the error below:

```
TypeError: cat() received an invalid combination of arguments - got (Tensor, Tensor, dim=int), but expected one of:
 * (tuple of Tensors tensors, int dim, *, Tensor out)
 * (tuple of Tensors tensors, name dim, *, Tensor out)
```

Regarding negative values, I will define a function that takes absolute values and restores the sign after the harmonic mean is computed. However, the error above pops up even if I make all values positive.
```
c = torch.stack([batch_A, batch_B])
```
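The sign trick described above could look like this (a sketch: `signed_hmean` is a hypothetical helper name, taking the sign of the sum is just one possible convention, and the `eps` guard and shape-aligning `view` are my additions):

```python
import torch

def signed_hmean(a, b, eps=1e-8):
    """Harmonic mean of |a| and |b|, with a sign restored afterwards.

    Sign convention here: the sign of the arithmetic sum a + b.
    eps guards against division by zero when both magnitudes vanish.
    """
    mag = 2 * a.abs() * b.abs() / (a.abs() + b.abs() + eps)
    return mag * torch.sign(a + b)

# Dummy stand-ins shaped like the batches in the post.
batch_A = torch.randn(5, 1, 1, 1, requires_grad=True)
batch_B = torch.randn(5, 1, requires_grad=True)

out = signed_hmean(batch_A.view(-1, 1), batch_B)
out.sum().backward()  # fully out of place, so no version-counter error
```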