# Repeating vector entries an unequal number of times in PyTorch?

I know `torch.Tensor.repeat` is similar to `np.tile`, but is there a way in PyTorch to reproduce the following NumPy behavior?

```
a = np.array([1, 2, 3, 10])
print(np.repeat(a, [4, 3, 4, 5], axis=0))
```

which outputs

```
[ 1  1  1  1  2  2  2  3  3  3  3 10 10 10 10 10]
```

I’ve looked online but couldn’t find an adequate solution. More explicitly (although the np example is pretty obvious), I want to do the following:

given a tensor `a = [1, 2, 3, 10]`,

I want to replicate each entry a certain number of times, as defined by a second tensor. For instance, if
`rep = [4, 3, 4, 5]`, then the result of the operation would be the following tensor:

`new_a = [1, 1, 1, 1, 2, 2, 2, 3, 3, 3, 3, 10, 10, 10, 10, 10]`

I know it is possible using the `gather` command, for example:

```
IND = torch.tensor([0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 3])
new_a = a.gather(0, IND)
```

but the problem with this is that IND has to be a LongTensor, and for the size of my problem it ends up taking around 8 GB of GPU RAM, which is prohibitive. Most of the entries in IND are redundant anyway, which makes this method quite inefficient.
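(For completeness, the index tensor can at least be built from the repeat counts instead of being written out by hand, e.g. with `torch.arange` and `torch.bucketize`, which is available in more recent PyTorch releases. This is just a sketch; it still materializes the full index tensor, so it doesn't solve the memory issue:

```python
import torch

a = torch.tensor([1, 2, 3, 10])
rep = torch.tensor([4, 3, 4, 5])

# Output positions 0..sum(rep)-1, bucketized against the cumulative
# counts [4, 7, 11, 16], give the source index for every output element.
IND = torch.bucketize(torch.arange(rep.sum()), rep.cumsum(0), right=True)
new_a = a.gather(0, IND)
```
)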

Any help would be appreciated!!

As a workaround you could perform the repetition in a loop:

```
new_a = []
for a_, size in zip(a, [4, 3, 4, 5]):
    # repeat each (0-dim) entry `size` times, then concatenate
    new_a.append(a_.repeat(size))
new_a = torch.cat(new_a)
```

This might not be as performant as your code, but it will save some memory for the indices.
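For what it's worth, more recent PyTorch releases ship `torch.repeat_interleave`, which mirrors `np.repeat` directly and accepts a tensor of per-element counts, so no explicit index tensor is needed (the full output is still allocated, of course):

```python
import torch

a = torch.tensor([1, 2, 3, 10])
rep = torch.tensor([4, 3, 4, 5])

# each a[i] is repeated rep[i] times, matching np.repeat(a, rep)
new_a = torch.repeat_interleave(a, rep)
```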