I made a `torch.nn.Module` layer
that swaps axes of the input tensor with `swapaxes`.
What I want to know is: when backpropagation of the loss is performed,
is the axis of the gradient also automatically swapped back?
I ask because after I swap an axis of the input tensor, the loss increases somewhat and does not decrease, so I want to rule this issue out. Thanks.
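For what it's worth, here is a minimal sketch (my own example, not from the original layer) that checks this directly: autograd records `swapaxes` as a differentiable op, so the gradient flowing back is swapped to match the original layout of `x`.

```python
import torch

# Hypothetical toy check: swap axes, compute a loss, and inspect x.grad.
x = torch.randn(2, 3, requires_grad=True)
y = x.swapaxes(0, 1)                      # shape (3, 2)

# Weight each element of y so the gradient pattern is easy to verify.
w = torch.arange(6.0).reshape(3, 2)
loss = (y * w).sum()
loss.backward()

# dloss/dy is w; autograd swaps it back, so x.grad is w.T with shape (2, 3),
# i.e. the same layout as x.
print(x.grad.shape)                       # torch.Size([2, 3])
print(torch.equal(x.grad, w.T))           # True
```

So the swap itself is handled correctly by autograd; if the loss stops decreasing, the cause is more likely elsewhere (e.g. the swapped layout no longer matching what a downstream layer expects).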