Differentiation in PyTorch

Hi all,

I have been reading up on differentiation techniques in DL frameworks. I came across the terms symbol-to-number differentiation, which is used by Caffe, and symbol-to-symbol differentiation. The former describes the approach of feeding actual numeric values into the computational graph and computing the derivative in terms of those numeric values. The latter describes an approach of adding additional nodes to the computational graph (providing a description of the derivatives), which are then used to compute the gradients. This latter approach is used by TensorFlow. I was wondering which approach PyTorch takes here.
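Here is how I picture the distinction, as a toy sketch for f(x) = x * x (the Var/Const/Mul/Add classes below are hypothetical illustrations, not any framework's real API):

```python
# Toy sketch: symbol-to-symbol builds NEW graph nodes for the derivative,
# symbol-to-number computes derivative values directly from stored numbers.

class Var:
    def __init__(self, name):
        self.name = name
    def eval(self, env):
        return env[self.name]
    def grad(self, wrt):
        return Const(1.0) if self is wrt else Const(0.0)

class Const:
    def __init__(self, value):
        self.value = value
    def eval(self, env):
        return self.value
    def grad(self, wrt):
        return Const(0.0)

class Mul:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def eval(self, env):
        return self.a.eval(env) * self.b.eval(env)
    def grad(self, wrt):
        # symbol-to-symbol: return new nodes for d(a*b) = a'*b + a*b'
        return Add(Mul(self.a.grad(wrt), self.b),
                   Mul(self.a, self.b.grad(wrt)))

class Add:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def eval(self, env):
        return self.a.eval(env) + self.b.eval(env)
    def grad(self, wrt):
        return Add(self.a.grad(wrt), self.b.grad(wrt))

x = Var("x")
f = Mul(x, x)

# symbol-to-symbol: df is itself a graph; numbers enter only at eval time
df = f.grad(x)
print(df.eval({"x": 3.0}))  # 6.0

# symbol-to-number: walk the graph once, producing derivative VALUES
# directly from the stored numbers, without new derivative nodes
def grad_value(node, wrt, env):
    if isinstance(node, Var):
        return 1.0 if node is wrt else 0.0
    if isinstance(node, Const):
        return 0.0
    if isinstance(node, Mul):
        return (grad_value(node.a, wrt, env) * node.b.eval(env)
                + node.a.eval(env) * grad_value(node.b, wrt, env))
    return grad_value(node.a, wrt, env) + grad_value(node.b, wrt, env)

print(grad_value(f, x, {"x": 3.0}))  # 6.0
```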

Best and thank you.

Hi,

I am not familiar with these terms so I cannot really tell you directly.
In PyTorch, we create a graph that contains the functions to call to compute the backward pass.
Note that, contrary to TensorFlow, the graph is created dynamically during the forward pass. So we don’t really have a concept of a “symbol”: everything is already a Tensor with values in it.
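A minimal sketch of what this looks like in practice: the backward graph is recorded on the fly while the forward runs, and every node already holds concrete values.

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x * x).sum()   # forward pass: backward graph nodes are created here

print(y.grad_fn)    # e.g. <SumBackward0 ...>: the function to call in backward
y.backward()        # walks the recorded graph
print(x.grad)       # tensor([4., 6.]) == d(sum(x^2))/dx = 2x
```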


Hi,

Thank you very much for your fast response. As far as I understand your answer, this would fall into the category of symbol-to-number differentiation, since the numeric values are inserted directly into the graph without adding symbolic nodes to represent them.

I would agree, yes.
What I understand is that the original TensorFlow cannot do this because it builds the graph ahead of time, so there are no “numbers” available yet.
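A minimal sketch of that, assuming the TensorFlow 1.x graph-mode API (reachable via tf.compat.v1 in TF 2): the whole graph, including the gradient nodes that tf.gradients adds, is built before any numbers exist.

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph mode
tf.disable_eager_execution()

# The graph is built ahead of time: x has no value yet, only a shape/dtype.
x = tf.placeholder(tf.float32, shape=[2])
y = tf.reduce_sum(x * x)

# tf.gradients adds NEW symbolic nodes for dy/dx to the same graph.
grad = tf.gradients(y, x)[0]

with tf.Session() as sess:
    # Numbers only appear when the graph is actually run.
    print(sess.run(grad, feed_dict={x: [2.0, 3.0]}))  # [4. 6.]
```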


Oh, now I see what you were aiming at by pointing out the difference in graph creation between PyTorch and TensorFlow. Thank you very much, this helped a lot.
