Hi there,
I have a list of tensors of different sizes, and I want to concatenate a subset of them, selected by index, without using a for-loop.
Here is an example to do it using a loop:
import torch

item_features = [
    torch.LongTensor([0]),
    torch.LongTensor([1, 10]),
    torch.LongTensor([2, 27, 31]),
    torch.LongTensor([3]),
    torch.LongTensor([4]),
    torch.LongTensor([5, 11]),
]
item_ids = torch.LongTensor([1, 5])
result = torch.cat([
    item_features[item_id]
    for item_id in item_ids
])
result  # tensor([ 1, 10,  5, 11])
I also tried doing it by storing the tensors in a CSR-like layout (flattened values plus row offsets):
item_features = torch.LongTensor([0, 1, 10, 2, 27, 31, 3, 4, 5, 11])
item_indices = torch.LongTensor([0, 1, 3, 6, 7, 8, 10])
item_ids = torch.LongTensor([1, 5])
result = torch.cat([
    item_features[item_indices[item_id]: item_indices[item_id + 1]]
    for item_id in item_ids
])
result  # tensor([ 1, 10,  5, 11])
But this still slices the tensor multiple times inside a Python loop, which is similar to this topic: How to slice multiple intervals over a tensor.
So, how can I do this efficiently with vectorized tensor operations?
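For what it's worth, here is a sketch of one loop-free approach I tried using the CSR-style offsets above: build a flat gather index with `torch.repeat_interleave`, `cumsum`, and `arange`, then index once. I'm not sure it's optimal, so corrections are welcome:

```python
import torch

item_features = torch.LongTensor([0, 1, 10, 2, 27, 31, 3, 4, 5, 11])
item_indices = torch.LongTensor([0, 1, 3, 6, 7, 8, 10])
item_ids = torch.LongTensor([1, 5])

# Start offset and length of each selected item's slice
starts = item_indices[item_ids]                 # tensor([1, 8])
lengths = item_indices[item_ids + 1] - starts   # tensor([2, 2])

# Repeat each start once per element of its slice
repeated_starts = torch.repeat_interleave(starts, lengths)  # tensor([1, 1, 8, 8])

# Within-slice positions: a global arange minus each slice's cumulative start
slice_starts = torch.cat([lengths.new_zeros(1), lengths.cumsum(0)[:-1]])
within = torch.arange(int(lengths.sum())) - torch.repeat_interleave(slice_starts, lengths)

# Single fancy-indexing gather, no Python-level loop over item_ids
gather_idx = repeated_starts + within   # tensor([1, 2, 8, 9])
result = item_features[gather_idx]      # tensor([ 1, 10,  5, 11])
```

The cost is building `gather_idx`, but every step is a tensor op, so the work stays out of the Python interpreter even for many `item_ids`.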