Slicing tensor of different size

Hi. I’m trying to slice a tensor with a different range for each batch element. I have two tensors holding the start and end indices, respectively. For example, given a tensor of size [3, 1, 8], I want to keep a different slice of each row:

start = torch.tensor([1, 3, 1])
end = torch.tensor([7, 7, 5])

# Do slicing

tensor([[[ 0,  1,  2,  3,  4,  5,  6,  7]],   # [1, 7]
        [[ 8,  9, 10, 11, 12, 13, 14, 15]],   # [3, 7]
        [[16, 17, 18, 19, 20, 21, 22, 23]]])  # [1, 5]

So the first one will be [1, 2, 3, 4, 5, 6], the second [11, 12, 13, 14], and the last [17, 18, 19, 20]. I used iteration, but it is too slow and not elegant. How can I do this faster and more elegantly than with a plain loop? My implementation is below.

context = torch.zeros(batch_size, 1, self.hid_dim).to(device)

for i in range(batch_size):
  # positions inside this batch element's window: [2D + 1]
  local = torch.arange(start[i], end[i], device=device)
  # Gaussian weighting centred on the predicted position Pt[i]: [2D + 1]
  local = torch.exp(-((local - Pt[i]) ** 2) / ((self.D ** 2) / 2))
  score = align_score[i] * local  # [1, 2D + 1]
  context[i] = score.mm(H[i])    # [1, hid_dim]
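For what it’s worth, here is a loop-free sketch of the same computation: instead of materialising a different-length window per batch element, score every position and zero out the ones outside [start, end) with a boolean mask, then do one batched matmul. The concrete shapes and values of `H`, `align_score`, `Pt`, and `D` are assumptions, since the thread doesn’t give them:

```python
import torch

B, L, hid = 3, 8, 4          # hypothetical batch size, source length, hidden dim
D = 2.0                      # hypothetical window half-width
H = torch.randn(B, L, hid)           # assumed encoder states
align_score = torch.randn(B, 1, L)   # assumed alignment scores over all L positions
Pt = torch.tensor([4.0, 5.0, 3.0])   # assumed predicted centre positions
start = torch.tensor([1, 3, 1])
end = torch.tensor([7, 7, 5])

pos = torch.arange(L, dtype=torch.float)                     # [L]
gauss = torch.exp(-(pos - Pt[:, None]) ** 2 / (D ** 2 / 2))  # [B, L]
mask = (pos >= start[:, None]) & (pos < end[:, None])        # [B, L], True inside each window
score = align_score * (gauss * mask).unsqueeze(1)            # [B, 1, L], zero outside the window
context = torch.bmm(score, H)                                # [B, 1, hid] in one batched matmul
```

Because the out-of-window scores are exactly zero, they contribute nothing to the matmul, so the result matches the per-element loop while staying fully batched.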

You can use torch.gather to accomplish this. This article will help you get started.
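For instance, a minimal sketch on the example tensor above. Note that `gather`’s output takes the shape of the index tensor, so every row must request the same number of elements; one common workaround (an assumption, not from the thread) is to pad short windows by repeating the last valid index:

```python
import torch

x = torch.arange(24).reshape(3, 1, 8)
start = torch.tensor([1, 3, 1])
end = torch.tensor([7, 7, 5])

width = int((end - start).max())                 # widest window: 6
offsets = torch.arange(width)                    # [0 .. 5]
# build per-row indices, clamped so short windows repeat their last index
idx = torch.minimum(start[:, None] + offsets, end[:, None] - 1)  # [3, 6]
out = x.gather(2, idx.unsqueeze(1))              # [3, 1, 6]
# out[0, 0] -> tensor([ 1,  2,  3,  4,  5,  6])
# out[1, 0] -> tensor([11, 12, 13, 14, 14, 14])
# out[2, 0] -> tensor([17, 18, 19, 20, 20, 20])
```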

I don’t think that answers it. The point is that I want to extract slices of different sizes, so the first example gives [1, 2, 3, 4, 5, 6] and the second gives [11, 12, 13, 14]. As far as I can tell, torch.gather can only produce outputs of the same size.

I’m facing the same issue here, have you found a solution?

Sorry for the late answer. I’m afraid I couldn’t find a solution; I just used a for loop.

Thanks so much for your reply! I worked out something like this: Scatter concatenation? The speed is fine, but it cannot be batched.
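For later readers: if a flat concatenation of all the selected values is acceptable, boolean indexing does it in one shot without a Python loop. A sketch on the thread’s example tensor:

```python
import torch

x = torch.arange(24).reshape(3, 1, 8)
start = torch.tensor([1, 3, 1])
end = torch.tensor([7, 7, 5])

pos = torch.arange(8)
# mask[i, j] is True when start[i] <= j < end[i]
mask = (pos >= start[:, None]) & (pos < end[:, None])   # [3, 8]
flat = x[:, 0][mask]  # all windows concatenated, in batch order
# -> tensor([ 1,  2,  3,  4,  5,  6, 11, 12, 13, 14, 17, 18, 19, 20])
```

Since `mask.sum(dim=1)` gives each window’s length, `torch.split(flat, mask.sum(dim=1).tolist())` recovers the per-batch ragged pieces if they are needed individually.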