Automatic differentiation and slicing

Hello,

Let's say my network produces an output x as a 1D tensor with values [1, 5, 2, 10]. If I slice it into two different 1D tensors, e.g. a = x[0:1] and b = x[2:3], and put them into a loss function, will the automatic differentiation algorithm still be able to differentiate through my network? How does the automatic differentiation algorithm know that these tensors come from my network?

Yes, autograd is still able to construct the graph in this case, just as it is for any viewing or reshaping operation. Autograd can do this because slicing is overloaded by PyTorch, so any call to slice has autograd logic interposed that records the operation in the backward graph.
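A minimal sketch to illustrate this (the tensor values match your example, but the loss itself is made up for demonstration):

```python
import torch

# Stand-in for a network output; requires_grad simulates upstream parameters.
x = torch.tensor([1.0, 5.0, 2.0, 10.0], requires_grad=True)

# Slices are views, so they stay connected to the autograd graph.
a = x[0:1]
b = x[2:3]

# Any differentiable loss built from the slices works; this one is illustrative.
loss = (a - b).pow(2).sum()
loss.backward()

# Gradients flow back only through the sliced positions; the rest stay zero.
print(x.grad)  # tensor([-2., 0., 2., 0.])
```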

Can I autograd some Tensor automatically if it can't reconstruct the graph?

Not completely sure what you mean by that, but maybe you are looking for Automatic differentiation package - torch.autograd — PyTorch 2.0 documentation?
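If you are asking about computing gradients explicitly rather than via `.backward()`, `torch.autograd.grad` does that; a minimal sketch, under my guess at what you mean:

```python
import torch

x = torch.tensor([1.0, 5.0, 2.0, 10.0], requires_grad=True)
loss = (x[0:1] - x[2:3]).pow(2).sum()

# torch.autograd.grad returns the gradients directly as a tuple
# instead of accumulating them into x.grad.
(grad_x,) = torch.autograd.grad(loss, x)
print(grad_x)  # tensor([-2., 0., 2., 0.])
```

Note that this still requires an intact graph from `x` to `loss`; if the graph was never built (e.g. the tensor was created outside any recorded computation), no API can recover it after the fact.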