Will torch.stack() on an autograd.Variable affect the backpropagation process?

In my code, I need to use stack operation as follows:

mmtx1 = torch.stack([mtx1]*m)

where mtx1 is a torch.autograd.Variable of shape (n, d).

The loss is calculated from mmtx1. My question: will this operation affect the autograd process, given that mtx1 is duplicated m times?
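
For reference, here is a minimal, self-contained sketch of what I mean (the shapes and values are placeholders, not my real code; in recent PyTorch versions, Variable has been merged into Tensor, so a tensor with requires_grad=True behaves the same for autograd):

import torch

# Placeholder sizes for illustration only
n, d, m = 4, 3, 5

# Equivalent to an autograd.Variable that requires gradients
mtx1 = torch.randn(n, d, requires_grad=True)

# Duplicate mtx1 m times along a new leading dimension
mmtx1 = torch.stack([mtx1] * m)
print(mmtx1.shape)  # torch.Size([5, 4, 3]), i.e. (m, n, d)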

It depends on what you mean by “influence”. The stack operation is of course part of the computation graph, so autograd takes it into account, but the gradients will be correct: since mtx1 is used m times, the gradients flowing back into each of its m copies are summed to form mtx1.grad.
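
As a quick sanity check, here is a minimal sketch with made-up shapes and a toy loss (using a plain requires_grad tensor, since Variable has since been merged into Tensor):

import torch

n, d, m = 4, 3, 5
mtx1 = torch.randn(n, d, requires_grad=True)

mmtx1 = torch.stack([mtx1] * m)   # shape (m, n, d)
loss = (mmtx1 * 2.0).sum()        # toy loss: each copy contributes a gradient of 2
loss.backward()

# All m copies point back to the same mtx1, so their gradients are summed
print(torch.allclose(mtx1.grad, torch.full((n, d), 2.0 * m)))  # True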


Thanks for your quick reply.
I just wanted to make sure the gradient is calculated correctly.
Your answer is exactly what I needed.