Hi, I have two batches containing logits produced by classifiers. I'd like to produce another batch of the same size, holding the harmonic mean of each pair of values.

```
batch_A = tensor([[[[-0.39453]]],
        [[[-0.39018]]],
        [[[ 0.01991]]],
        [[[-0.40854]]],
        [[[-0.39593]]],
        [[[ 0.21211]]]], device='cuda:0', grad_fn=<CudnnConvolutionBackward>)

batch_B = tensor([[-0.00552],
        [ 0.26541],
        [ 1.66024],
        [-0.33280],
        [ 0.35407],
        [ 0.55062]], device='cuda:0', grad_fn=<AddmmBackward>)
```

I'd like the new batch to hold the harmonic mean of batch_A and batch_B, ideally with `grad_fn=<AddmmBackward>` or otherwise in the same form as batch_B.
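Conceptually, this is the computation I'm after (a minimal CPU sketch with random stand-in tensors of the same shapes; the `view_as` call to flatten batch_A's trailing singleton dims is my own assumption about how to align the shapes):

```python
import torch

# CPU stand-ins with the same shapes as the batches above.
batch_A = torch.randn(6, 1, 1, 1, requires_grad=True)
batch_B = torch.randn(6, 1, requires_grad=True)

# Flatten batch_A's trailing singleton dimensions so the shapes line up.
A = batch_A.view_as(batch_B)

# Element-wise harmonic mean, computed out of place (no writes into batch_B).
harmonic = 2 * A * batch_B / (A + batch_B)

print(harmonic.shape)    # torch.Size([6, 1])
print(harmonic.grad_fn)  # the grad_fn of the last op (a Div node), not AddmmBackward
```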

My first attempt was to just sum the two batches element by element. It kind of worked, but the result had `grad_fn=<CopySlices>`:

```
for ele in range(len(batch_B)):
    batch_B[ele] = batch_B[ele] + batch_A[ele]
```
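As I understand it, the `CopySlices` comes from writing into `batch_B[ele]` in place. An out-of-place sum seems to avoid that (again a CPU sketch with stand-in tensors; `view_as` for shape alignment is my assumption):

```python
import torch

batch_A = torch.randn(6, 1, 1, 1, requires_grad=True)
batch_B = torch.randn(6, 1, requires_grad=True)

# Out of place: builds a new tensor instead of writing into batch_B,
# so batch_B and its history are left untouched.
summed = batch_B + batch_A.view_as(batch_B)

print(summed.grad_fn)  # an Add node; batch_B itself is unchanged
```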

I then tried the same pattern, but computing the harmonic mean of each element pair:

```
for ele in range(len(batch_B)):
    batch_B[ele] = (batch_B[ele] * batch_A[ele] * 2) / (batch_B[ele] + batch_A[ele])
```

But I got the error below:

**RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [1]], which is output 0 of SelectBackward, is at version 32; expected version 31 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).**

I'd really appreciate any help.