Tensor - read and write values across batches and channels

I have a batched tensor [B, C, H, W], where B is the batch size, C is the number of channels, and H and W are the spatial dimensions. I have an array of indices, and I would like to read the values at those indices from the input tensor and write them to another tensor.

For example:

two tensors A (source) and B (target)

B = 1
C = 3
H, W = 64, 64

indicesRead = [[0, 0], [13, 15], [32, 43]]
indicesWrite = [[7, 5], [1, 1], [4, 4]]

and I would like to get from tensor A the value for channel 0 at [0, 0], channel 1 at [13, 15], and channel 2 at [32, 43].

Once I have these values, I want to write them to tensor B: channel 0 to position [7, 5] (in this example, copy A[0, 0, 0, 0] to B[0, 0, 7, 5]), and so on.
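
In plain Python, what I mean is roughly the loop below (a minimal sketch using the shapes from above; src and dst are just placeholder names for A and B, since B is also my batch-size variable):

import torch

batch, C, H, W = 1, 3, 64, 64
src = torch.randn(batch, C, H, W)            # tensor A (source)
dst = torch.zeros(batch, C, H, W)            # tensor B (target)

indicesRead = [[0, 0], [13, 15], [32, 43]]   # one (h, w) pair per channel
indicesWrite = [[7, 5], [1, 1], [4, 4]]

# copy src[b, c, rh, rw] -> dst[b, c, wh, ww], pairing channel c with the c-th index pair
for b in range(batch):
    for c in range(C):
        rh, rw = indicesRead[c]
        wh, ww = indicesWrite[c]
        dst[b, c, wh, ww] = src[b, c, rh, rw]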

Can this be done with torch methods, or do I have to iterate over the tensor manually?

Simple 3x3 2-channel example (1 batch only):

import numpy as np
import torch

Anp = np.array([[[[0, 1, 2], [3, 4, 5], [6, 7, 8]],
                 [[0, 11, 22], [33, 44, 55], [66, 77, 88]]]])

A = torch.as_tensor(Anp, dtype=torch.float)
B = torch.empty_like(A)

indicesRead = [[[0, 1], [1, 2]]]
indicesWrite = [[[1, 1], [2, 2]]]

#what next?

Note: the read and write indices can be reoriented if needed; they don't have to be in this format.

A direct assignment might work:

import torch

B = 1
C = 3
H, W = 64, 64

indicesRead = torch.tensor([[0, 0], [13, 15], [32, 43]])
indicesWrite = torch.tensor([[7, 5], [1, 1], [4, 4]])

a = torch.randn(B, C, H, W)
b = torch.zeros(B, C, H, W)

# all index tensors broadcast to shape (C,), so channel c is read at indicesRead[c] and written at indicesWrite[c]
b[:, torch.arange(C), indicesWrite[:, 0], indicesWrite[:, 1]] = a[:, torch.arange(C), indicesRead[:, 0], indicesRead[:, 1]]
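
As a quick sanity check on the 3x3 example from the question (a minimal sketch; the indices here are written without the extra batch dimension, and the target is named out to avoid reusing B):

import torch

A = torch.tensor([[[[0, 1, 2], [3, 4, 5], [6, 7, 8]],
                   [[0, 11, 22], [33, 44, 55], [66, 77, 88]]]], dtype=torch.float)
out = torch.zeros_like(A)

indicesRead = torch.tensor([[0, 1], [1, 2]])    # channel 0 reads [0, 1], channel 1 reads [1, 2]
indicesWrite = torch.tensor([[1, 1], [2, 2]])   # channel 0 writes [1, 1], channel 1 writes [2, 2]

C = A.shape[1]
out[:, torch.arange(C), indicesWrite[:, 0], indicesWrite[:, 1]] = \
    A[:, torch.arange(C), indicesRead[:, 0], indicesRead[:, 1]]

print(out[0, 0, 1, 1])   # tensor(1.)  == A[0, 0, 0, 1]
print(out[0, 1, 2, 2])   # tensor(55.) == A[0, 1, 1, 2]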

Works great, thank you