Hi all,

I have an input to my NN consisting of a tensor `u`, which carries the gradient history needed for backpropagation (in my example its `grad_fn` is `<AddBackward0>`), and a `numpy.array` `y`.

The NN model takes a single tensor as input, so I convert the array to a tensor and concatenate the two into `z`.

I feed this concatenated tensor into the NN model. It is important to note that the `u` part of the input carries gradient history while the part corresponding to `y` doesn't (it originates from `numpy.array()`).

When I check the output of the NN, I see that it is a tensor with `grad_fn=<CatBackward>`.
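For reference, here is a minimal sketch of what I mean (the shapes and values are made up just to illustrate the setup, not my actual model):

```python
import numpy as np
import torch

# Made-up values, just to reproduce the situation described above.
a = torch.ones(3, requires_grad=True)
u = a + 1                            # u.grad_fn is <AddBackward0>
y_np = np.array([1.0, 2.0, 3.0])     # plain numpy array, no grad history
y = torch.from_numpy(y_np).float()   # converted to a tensor

z = torch.cat([u, y])                # z.grad_fn is <CatBackward0>
print(type(z.grad_fn).__name__)      # e.g. CatBackward0
```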

What does the changed `grad_fn` name mean?

Does it mean that the original gradient history has been overwritten?

If I run backpropagation through the model, will it still be able to use the gradient history stored in `u`?

Is it possible to backpropagate only through `u`, even though the input is both `u` and `y`?

Thanks in advance