Differentiable way to assign an array/tensor to another tensor

Hello,
I’m not very skilled with PyTorch, so I’m sorry if this is a weird question.

I have a tensor of zeros (say A) and a tensor that requires a gradient (say B), and I’m trying to assign B into a slice of A while keeping the gradient.
A is then used as the input to a subsequent MLP (say MLP1), and the loss is calculated.

Here’s some sample code:

import torch

A = torch.zeros(3, 4, 5)
B = torch.arange(10, dtype=torch.float32, requires_grad=True).reshape(2, 5)
A[0, :2, :] = B  # Does this operation keep the gradient?
# A is then used as the input to a subsequent network (e.g., MLP1) and the loss is calculated

where B is actually the output of another MLP (say MLP2), so I’m expecting back-propagation to eventually update the weights of both MLP1 and MLP2.
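For context, here is a rough end-to-end sketch of what I have in mind (the nn.Linear layers and their sizes are just placeholders for my real MLP1/MLP2):

import torch
import torch.nn as nn

mlp2 = nn.Linear(7, 10)  # produces B (10 values, reshaped to (2, 5))
mlp1 = nn.Linear(5, 1)   # consumes A

x = torch.randn(7)
B = mlp2(x).reshape(2, 5)   # B carries a grad_fn from MLP2

A = torch.zeros(3, 4, 5)
A[0, :2, :] = B             # the slice assignment is recorded by autograd
loss = mlp1(A).sum()        # A feeds MLP1 and the loss
loss.backward()

print(mlp1.weight.grad is not None)  # True
print(mlp2.weight.grad is not None)  # True -> gradient flows back through the assignment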

Does this operation work as I expect?

Thank you in advance!

I’m sorry, I have confirmed it myself: the models’ weights are updated.
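For reference, one quick way to check this, continuing the sketch above (the SGD optimizer and learning rate are arbitrary):

optimizer = torch.optim.SGD(list(mlp1.parameters()) + list(mlp2.parameters()), lr=0.1)
before = mlp2.weight.detach().clone()
optimizer.step()                         # uses the gradients from loss.backward() above
print(torch.equal(before, mlp2.weight))  # False -> MLP2's weights were updated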