Changing the size of a network tensor during training?


I’m wondering whether I can change the size of a parameter tensor during training of a PyTorch network — for example, replace it with a copy of itself plus one extra row. I’d do this update after backprop for the current epoch has finished and before the forward pass of the next epoch, so presumably the computation graph (which is rebuilt on each forward pass) wouldn’t care about this?
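To make the question concrete, here is a minimal sketch of what I mean. This is illustrative, not a working training setup: the layer sizes, the zero-initialization of the new row, and the choice of SGD are all just placeholders. Growing a `Linear` layer by one output row between epochs might look like:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 3)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# --- one training step ---
x = torch.randn(8, 4)
loss = model(x).sum()
opt.zero_grad()
loss.backward()
opt.step()

# --- after backward/step, before the next forward: grow the weight ---
with torch.no_grad():
    new_row_w = torch.zeros(1, model.in_features)  # init for the new row (placeholder)
    new_row_b = torch.zeros(1)
    # Replace the old Parameters with enlarged copies
    model.weight = nn.Parameter(torch.cat([model.weight, new_row_w], dim=0))
    model.bias = nn.Parameter(torch.cat([model.bias, new_row_b], dim=0))
    model.out_features += 1

# The optimizer still holds references to the old Parameter objects,
# so it has to be rebuilt (and any per-parameter state like momentum
# buffers would otherwise be stale or shape-mismatched).
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# The next forward pass builds a fresh graph over the new tensors
y = model(x)
print(y.shape)  # now (8, 4) instead of (8, 3)
```

My main uncertainty is whether replacing the `nn.Parameter` objects like this (and re-creating the optimizer) is the sanctioned way to do it, or whether there’s a gotcha I’m missing.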