ONNX Total Accuracy Loss

Hello,
I am currently trying to modify the output of a convolution with a forward hook. So far this works in PyTorch and I have ~67% on Cifar10 before adding the hook and ~35% after adding the hook. This are values which are fine for this testcase. But after exporting the model to onnx, the accuracy drops down to 10% being equivalent to random within Cifar10. The forward hook looks like this:

# Inside the forward hook: scatter the pruned conv output back into
# a zero tensor with the original number of output channels.
zero_tensor = torch.zeros(
    (o_tensor.shape[0], module.out_channels,
     o_tensor.shape[2], o_tensor.shape[3])
).to(o_tensor.device)
zero_tensor[:, module.keep_idxs.to(o_tensor.device), :, :] = o_tensor
return zero_tensor
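For context, here is a minimal, self-contained sketch of how a snippet like this can sit inside a registered forward hook. The concrete shapes, the pruned `nn.Conv2d`, and the `keep_idxs`/`out_channels` attributes stored on the module are my illustrative assumptions, not the original model:

import torch
import torch.nn as nn

# Hypothetical setup: a conv pruned from 8 to 4 output channels,
# with keep_idxs recording which original channels survived.
conv = nn.Conv2d(3, 4, 3, padding=1)
conv.out_channels = 8                        # original channel count (illustrative)
conv.keep_idxs = torch.tensor([0, 2, 5, 7])  # surviving channel positions

def pad_pruned_channels(module, i_tensor, o_tensor):
    # Create a full-width zero tensor and copy the kept channels into it.
    zero_tensor = torch.zeros(
        (o_tensor.shape[0], module.out_channels,
         o_tensor.shape[2], o_tensor.shape[3])
    ).to(o_tensor.device)
    zero_tensor[:, module.keep_idxs.to(o_tensor.device), :, :] = o_tensor
    return zero_tensor  # returned value replaces the conv's output

conv.register_forward_hook(pad_pruned_channels)
out = conv(torch.randn(1, 3, 5, 5))
# out now has 8 channels; the pruned channels are all zero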

My current guess is that the slicing notation is not compatible with ONNX export. Also note that the keep_idxs stay constant once set.

So my question is: how can I write this so that the functionality stays the same but it is compatible with ONNX export? I have already looked through the various tensor methods, but I was unable to find one that recreates what I achieve with the slicing notation.

EDIT 1: It turns out the whole output after that layer is zero in the ONNX model. So the output is not random data, but all zeros.

Best regards

Well, I found the solution to my problem. In case anybody else runs into this, here it is. I replaced the
zero_tensor[:, module.keep_idxs.to(o_tensor.device), :, :] = o_tensor
line with the following:
zero_tensor.index_copy_(1, module.keep_idxs, o_tensor)

The reason for the failure is the following: the ONNX exporter does not seem to support the colon (slice assignment) notation, at least not in PyTorch 1.3.1. index_copy_ is on the list of operators supported for ONNX export, and everything works now.
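To illustrate why the swap is safe, here is a small check (with made-up shapes and indices) showing that index_copy_ along the channel dimension produces exactly the same result as the slice-assignment version:

import torch

# Hypothetical pruned conv output: 4 of 8 original channels kept.
o_tensor = torch.randn(2, 4, 5, 5)
keep_idxs = torch.tensor([0, 2, 5, 7])

# Original slicing version (works in PyTorch, failed on ONNX export).
a = torch.zeros(2, 8, 5, 5)
a[:, keep_idxs, :, :] = o_tensor

# Export-friendly version: copy along dim 1 at the kept indices.
b = torch.zeros(2, 8, 5, 5)
b.index_copy_(1, keep_idxs, o_tensor)

print(torch.equal(a, b))  # the two results are identical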

Best regards