Copy a row from one tensor to another tensor row

What would be the easiest way in PyTorch to, say, copy the 2nd row of tensor A (32,1,16,16) to the 5th row of tensor B (32,1,16,16)?

By “row” I mean an index from the first dimension. So my copied block would be of size (1,1,16,16).
By “copy” I mean overwriting the 5th row of tensor B with data from the 2nd row of tensor A.
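A minimal sketch of that copy, assuming the Python API (tensor names and the row indices are just the ones from the question):

```python
import torch

A = torch.randn(32, 1, 16, 16)
B = torch.randn(32, 1, 16, 16)

# "Row" = an index into dim 0, so A[2] has shape (1, 16, 16).
# Plain index assignment copies the data into B's storage.
B[5] = A[2]
```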

Bonus question, is there a swap operation doing that? (otherwise 3 copies with an intermediate buffer will do the trick)
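As far as I know there is no dedicated cross-tensor swap op in the public API, so the three-copy approach with an intermediate buffer is the usual pattern; a sketch (the `clone()` matters, otherwise the buffer would alias the data you are about to overwrite):

```python
import torch

A = torch.randn(32, 1, 16, 16)
B = torch.randn(32, 1, 16, 16)

# Swap row 2 of A with row 5 of B via a cloned buffer.
tmp = A[2].clone()
A[2] = B[5]
B[5] = tmp
```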

For a bit of context: the bigger picture is that I have several tensors like A (my training set, sliced into batches of 32 samples of shape (1,16,16)), and I'd like to shuffle them at each epoch, which requires swapping rows across the different training tensors.
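For the epoch-shuffling use case, an alternative to pairwise swaps is to draw one random permutation with `torch.randperm` and index every training tensor with it, which reorders all rows in a single step; a sketch with one hypothetical batch tensor:

```python
import torch

batch = torch.randn(32, 1, 16, 16)

# One permutation of the row indices; reuse the same perm for
# every tensor that must stay aligned sample-for-sample.
perm = torch.randperm(batch.size(0))
shuffled = batch[perm]  # new tensor with rows reordered
```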

Found how to do it:

void Training::swapTensorRow(torch::Tensor tensorA, int indexA, torch::Tensor tensorB, int indexB)
{
    // Buffer a clone of the row, then overwrite both rows in place.
    torch::Tensor cache = tensorA.index({indexA}).clone();
    tensorA.index_put_({indexA}, tensorB.index({indexB}));
    tensorB.index_put_({indexB}, cache);
}

More detail about indexing in PyTorch vs LibTorch is in the "Tensor Indexing API" note of the LibTorch documentation, which maps each Python indexing form to its C++ `index`/`index_put_` equivalent.
