Indexing a tensor with different indices per sample

I would like to see if there is an elegant solution to this problem.

import torch
xb1 = torch.randn(64, 3, 128, 128)
xb2 = torch.randn(64, 3, 128, 128)

x1 = torch.randint(0, 128, (64,))
x2 = torch.randint(0, 128, (64,))
y1 = torch.randint(0, 128, (64,))
y2 = torch.randint(0, 128, (64,))

# this works fine
for i in range(len(x1)):
    xb2[i, :, x1[i]:x2[i], y1[i]:y2[i]] = xb1[i, :, x1[i]:x2[i], y1[i]:y2[i]]

# indexing directly throws an error
xb2[:, :, x1:x2, y1:y2] = xb1[:, :, x1:x2, y1:y2]

This will throw the following error:

TypeError                                 Traceback (most recent call last)
<ipython-input-11-88c5a959f6b1> in <module>
     13 x1.shape
     14 
---> 15 xb1[:, :, x1:x2, y1:y2] = xb2[:, :, x1:x2, y1:y2]

TypeError: only integer tensors of a single element can be converted to an index

I was able to solve this problem using a for loop, but I was wondering if torch has a better way to perform this operation.

As per the error trace, you cannot use a tensor with more than one element as a slice bound; only single-element integer tensors can be converted to an index. It is not very clear to me what you are trying to do. Can you post the for loop that you used? That would give a better idea of what you want to achieve.
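
A minimal sketch of that constraint, using a throwaway tensor t (not from the original thread):

import torch

t = torch.arange(10)
lo, hi = torch.tensor(2), torch.tensor(7)   # single-element tensors are accepted as slice bounds
print(t[lo:hi])                             # tensor([2, 3, 4, 5, 6])

idx = torch.tensor([2, 7])                  # more than one element
# t[idx[0]:idx[1]] works, but using idx itself as a slice bound raises:
# TypeError: only integer tensors of a single element can be converted to an index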

I have edited my question with the for loop example.

x1[i], x2[i] are completely random values, right? To the best of my knowledge, I don’t think there is an elegant way to do this. Anyway, I will try and let you know.

Thank you =) Yes, we are trying to implement CutMix: you randomly cut a small patch from one image and paste it onto another. The for loop works fine, but I was wondering if there is a way to speed it up.
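
One possible way to avoid the Python loop (a sketch under the shapes from the question, not a confirmed solution from this thread) is to build a per-sample boolean mask of the patch rectangles with broadcasting and then select with torch.where:

import torch

xb1 = torch.randn(64, 3, 128, 128)
xb2 = torch.randn(64, 3, 128, 128)
x1 = torch.randint(0, 128, (64,))
x2 = torch.randint(0, 128, (64,))
y1 = torch.randint(0, 128, (64,))
y2 = torch.randint(0, 128, (64,))

# Per-sample row/column masks via broadcasting: True inside [x1, x2) and [y1, y2),
# the same half-open ranges as the slice-based loop.
rows = torch.arange(128)                                   # (128,)
cols = torch.arange(128)                                   # (128,)
row_mask = (rows >= x1[:, None]) & (rows < x2[:, None])    # (64, 128)
col_mask = (cols >= y1[:, None]) & (cols < y2[:, None])    # (64, 128)

# Combine into a (64, 1, 128, 128) mask that broadcasts over the channel dimension.
mask = (row_mask[:, :, None] & col_mask[:, None, :]).unsqueeze(1)

# Take patch pixels from xb1, everything else from xb2 -- no Python-level iteration.
xb2 = torch.where(mask, xb1, xb2)

This keeps the same convention as the loop (an empty patch when x1[i] >= x2[i]) and replaces the per-sample slicing with a single batched operation.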