Insert a tensor into another tensor with different location depending on the index

I want to insert a small tensor B into a larger tensor A, where the insert location changes with an index. (I'm not sure how to explain it exactly…)
For example,

```python
import torch

a = torch.zeros((2, 8, 10))
b = torch.rand((2, 8, 2))
for i in range(8):
    a[:, i, i:i+2] = b[:, i, :]
```

Suppose the first dimension is the batch dimension and the second dimension is time. I want to insert `b` into `a`, but the location along the third dimension changes over time. The location does not necessarily depend on the time index.

Is there any tensor operation that can replace the loop and make it faster on the GPU?
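For reference, one approach I have seen is to replace the loop with a single advanced-indexing assignment: build a `(time, width)` grid of column indices (here a hypothetical `starts` tensor holds the start column for each time step, so it can be arbitrary rather than tied to `i`) and a matching grid of time indices, then index once. A minimal sketch, assuming the same shapes as above:

```python
import torch

torch.manual_seed(0)
a = torch.zeros((2, 8, 10))
b = torch.rand((2, 8, 2))

# Hypothetical start column for each time step; starts = arange(8)
# reproduces the loop above, but any values in [0, 8] would work.
starts = torch.arange(8)

cols = starts.unsqueeze(1) + torch.arange(2)       # (8, 2) column indices per step
rows = torch.arange(8).unsqueeze(1).expand(-1, 2)  # (8, 2) matching time indices
a[:, rows, cols] = b                               # one vectorized assignment, no loop
```

The `(8, 2)` index tensors broadcast together, and the leading `:` applies the same scatter pattern to every batch element, so the whole write happens in one kernel launch instead of eight.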