I was also quite confused about how to reproduce `tf.gather_nd` in torch at first, but I found that `tf.gather_nd` is essentially a variant of advanced indexing (slicing). My implementation is below; I hope it helps someone.

```
import torch

def gather_nd(params, indices):
    """params: n-dim tensor of shape [x1, x2, ..., xn];
    indices: 2-dim tensor of shape [num_samples, m] (m <= n)."""
    assert isinstance(indices, torch.Tensor)
    # Index with a tuple of per-dimension index tensors (a tuple, not a
    # list, so this also works on modern PyTorch versions).
    return params[tuple(indices.long().t())]
```
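To see why transposing the indices works: each row of `indices` is one full coordinate, and transposing yields one index sequence per dimension of `params`, which is exactly the form PyTorch's advanced indexing expects. A minimal sketch (the variable names here are my own, for illustration):

```
import torch

params = torch.arange(8).reshape(2, 2, 2)
indices = torch.tensor([[0, 0, 1], [1, 0, 1]])  # two full coordinates

# After transposing, each column becomes a per-dimension index list:
# dim0 -> [0, 1], dim1 -> [0, 0], dim2 -> [1, 1]
out = params[tuple(indices.t())]
print(out)  # the elements params[0, 0, 1] and params[1, 0, 1]
```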

## Testing

TensorFlow's documentation samples:

```
indices = [[1]]
params = [[['a0', 'b0'], ['c0', 'd0']],
          [['a1', 'b1'], ['c1', 'd1']]]
output = [[['a1', 'b1'], ['c1', 'd1']]]

indices = [[0, 1], [1, 0]]
params = [[['a0', 'b0'], ['c0', 'd0']],
          [['a1', 'b1'], ['c1', 'd1']]]
output = [['c0', 'd0'], ['a1', 'b1']]

indices = [[0, 0, 1], [1, 0, 1]]
params = [[['a0', 'b0'], ['c0', 'd0']],
          [['a1', 'b1'], ['c1', 'd1']]]
output = ['b0', 'b1']
```

And the same cases for torch:

```
>>> params = torch.rand(2, 2, 2)
>>> params
tensor([[[0.4685, 0.2514],
         [0.0624, 0.0797]],

        [[0.4989, 0.1414],
         [0.6970, 0.6825]]])
>>> indices = torch.Tensor([[1]])
>>> gather_nd(params, indices)
tensor([[[0.4989, 0.1414],
         [0.6970, 0.6825]]])
>>> indices = torch.Tensor([[0, 1], [1, 0]])
>>> gather_nd(params, indices)
tensor([[0.0624, 0.0797],
        [0.4989, 0.1414]])
>>> indices = torch.Tensor([[0, 0, 1], [1, 0, 1]])
>>> gather_nd(params, indices)
tensor([0.2514, 0.1414])
```
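As a further sanity check, the transposed advanced indexing can be compared against a naive per-row lookup (`gather_nd_ref` is my own helper name, not a library function):

```
import torch

def gather_nd_ref(params, indices):
    # Naive reference: look up each coordinate row one at a time.
    return torch.stack([params[tuple(idx)] for idx in indices.long()])

params = torch.rand(2, 2, 2)
indices = torch.tensor([[0, 1], [1, 0]])
fast = params[tuple(indices.t())]
assert torch.equal(fast, gather_nd_ref(params, indices))
```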