Hi,

I want to stack two tensors along a dimension, but not sequentially. Rather I want to specify the particular indexing of the stacking along that dimension. As a concrete example, I will show how this should work for just two tensors A and B:

```
A = torch.tensor([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
B = torch.tensor([[10, 11, 12],
                  [13, 14, 15],
                  [16, 17, 18],
                  [19, 20, 21]])
```

Stacking A and B along dim = 0, with this indexing specifying which rows of the result the rows of A should occupy:

```
stack_indexing_A = torch.tensor([1,3,6])
```

should yield this output:

```
result = torch.tensor([[10, 11, 12],
                       [1, 2, 3],
                       [13, 14, 15],
                       [4, 5, 6],
                       [16, 17, 18],
                       [19, 20, 21],
                       [7, 8, 9]])
```

Basically, I want to interleave A with B according to a specified index ordering for one of the two tensors (A in this case). This needs to be done without allocating a `torch.empty` tensor and filling it in, because both tensors are part of my model's computational graph, and assigning into a newly created empty tensor might detach them from the graph. I tried `torch.gather`, but as far as I understand it only works on a single tensor. Thanks for any help!
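For reference, here is a sketch of one approach I've been experimenting with: concatenate A and B, then permute the rows into place with `argsort` over the target positions (so everything is a view/indexing op, which I believe keeps autograd happy). The helper name `stack_indexing_B` is just my own for the leftover positions; I'm not sure this is the most efficient way, so corrections welcome:

```
import torch

A = torch.tensor([[1., 2., 3.],
                  [4., 5., 6.],
                  [7., 8., 9.]], requires_grad=True)
B = torch.tensor([[10., 11., 12.],
                  [13., 14., 15.],
                  [16., 17., 18.],
                  [19., 20., 21.]], requires_grad=True)

# Target output rows for A's rows.
stack_indexing_A = torch.tensor([1, 3, 6])

# B's rows fill the remaining output positions, in order.
n = A.shape[0] + B.shape[0]
mask = torch.ones(n, dtype=torch.bool)
mask[stack_indexing_A] = False
stack_indexing_B = torch.arange(n)[mask]  # -> positions 0, 2, 4, 5

# Row i of torch.cat([A, B]) should land at positions[i] in the output,
# so argsort(positions) tells each output slot which source row to take.
positions = torch.cat([stack_indexing_A, stack_indexing_B])
result = torch.cat([A, B], dim=0)[positions.argsort()]
```

Since this only uses `cat` and integer indexing, gradients flow back into both A and B (e.g. `result.sum().backward()` populates `A.grad` and `B.grad`).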