How can I insert a Tensor into another Tensor in PyTorch?

I have a PyTorch Tensor with shape (batch_size, step, vec_size), for example a Tensor(32, 64, 128); let's call it A.

I have another Tensor with shape (batch_size, vec_size), e.g. a Tensor(32, 128); let's call it B.

I want to insert B at a certain position along axis 1 of A. The insert positions are given by a Tensor(batch_size), named P.

I understand there is no empty tensor (like an empty list) in PyTorch, so I initialize A as zeros and add B at the given position along axis 1 of A:

    A = Variable(torch.zeros(batch_size, step, vec_size))

What I’m doing is like:

    for i in range(batch_size):
        pos = P[i]
        A[i][pos] = A[i][pos] + B[i]

But I get an Error:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

Then I clone A on every iteration of the loop:

    for i in range(batch_size):
        A = A.clone()  # clone so the write is not an in-place op on the old A
        pos = P[i]
        A[i][pos] = A[i][pos] + B[i]

This is very slow for autograd. Are there any better solutions? Thank you.
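For reference, one loop-free alternative is an out-of-place `scatter_add`, which adds each `B[i]` into row `P[i]` of `A[i]` in a single differentiable call. This is only a sketch under the shapes from the question (the random `B` and `P` here are placeholders, not the asker's data):

```python
import torch

batch_size, step, vec_size = 32, 64, 128  # illustrative sizes from the question
B = torch.randn(batch_size, vec_size, requires_grad=True)
P = torch.randint(0, step, (batch_size,))  # insert position per batch element

# Out-of-place scatter_add: no Python loop, no in-place write on A.
A = torch.zeros(batch_size, step, vec_size)
index = P.view(-1, 1, 1).expand(-1, 1, vec_size)  # (batch_size, 1, vec_size)
A = A.scatter_add(1, index, B.unsqueeze(1))       # A[i, P[i], :] += B[i, :]
```

Because `scatter_add` (the out-of-place variant) returns a new tensor, autograd never sees a mutation of a tensor it still needs, and gradients flow back into B.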

Is there a way to insert a tensor into an existing tensor? Do you need something like this?
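Something like the following, perhaps? A minimal sketch using advanced (integer-array) indexing, assuming the goal is to write each `B[i]` into row `P[i]` of a fresh zero buffer (the random `B` and `P` are placeholders):

```python
import torch

batch_size, step, vec_size = 32, 64, 128  # illustrative sizes
B = torch.randn(batch_size, vec_size, requires_grad=True)
P = torch.randint(0, step, (batch_size,))

# Advanced indexing writes every B[i] into A[i, P[i]] in one shot.
# A starts as a plain zeros buffer that nothing else depends on, so this
# single write does not trigger the in-place autograd error.
A = torch.zeros(batch_size, step, vec_size)
A[torch.arange(batch_size), P] = B  # gradients still flow back to B
```

If A were instead a pre-existing tensor that other computations depend on, cloning it once before the write (rather than once per loop iteration) should keep autograd happy.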