How to avoid a for-loop in torch

Hi all, I've run into a simple problem in torch.
I have a vector num_nodes that holds the number of nodes in each graph, and I want to build a vector batch that records which graph each node belongs to. It's easy to write with a for-loop, but that has high latency. Is there an operation I can use to vectorize it?

# input
# num_nodes: [2, 3]
batch = torch.empty(sum(num_nodes), dtype=torch.long)
cum = 0
for i, num in enumerate(num_nodes):
    batch[cum : cum + num] = i
    cum += num
# output
# batch: [0, 0, 1, 1, 1]

Any advice would be appreciated! :grinning:

This should be a bit faster:

def build_nodes_list(num_nodes: list):
    # one tensor filled with graph index i per graph, then concatenate
    return torch.cat([torch.full((num,), i) for i, num in enumerate(num_nodes)])
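A quick self-contained check on the question's input (note that torch.full takes a size tuple, so the per-graph size is written as (num,)):

```python
import torch

def build_nodes_list(num_nodes: list):
    # one tensor filled with graph index i per graph, concatenated
    return torch.cat([torch.full((num,), i) for i, num in enumerate(num_nodes)])

print(build_nodes_list([2, 3]))  # tensor([0, 0, 1, 1, 1])
```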

Hi Jiahuil!

Here’s a loop-free version:

>>> import torch
>>> torch.__version__
'1.13.1'
>>> num_nodes = torch.tensor ([2, 3, 2])
>>> batch = torch.zeros (num_nodes.sum(), dtype = torch.long)
>>> batch[num_nodes[:-1].cumsum (0)] = 1
>>> batch = batch.cumsum (0)
>>> batch
tensor([0, 0, 1, 1, 1, 2, 2])
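The same result can also be sketched with the stock torch.repeat_interleave op, which repeats each graph index i exactly num_nodes[i] times:

```python
import torch

num_nodes = torch.tensor([2, 3, 2])
# repeat graph index i num_nodes[i] times
batch = torch.repeat_interleave(torch.arange(len(num_nodes)), num_nodes)
print(batch)  # tensor([0, 0, 1, 1, 1, 2, 2])
```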

Best.

K. Frank
