Backward tracking of graph

I am trying to implement the Row LSTM from the paper, where the matrix multiplications in the RNN are replaced by conv1d. So I am using two for loops: one for the sequential feed and the other for the RNN layers. I really want to check that loss.backward() traverses back through all the loops, but there is no 'previous_functions' attribute for conv1d. Please guide.
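For context, here is a minimal toy sketch (my own simplified example, not the actual Row LSTM from the paper) of the two-loop structure I mean: an RNN cell whose input and hidden transforms are Conv1d instead of matrix multiplications, driven by an outer time loop and an inner layer loop, with backward() called on the final loss.

```python
import torch
import torch.nn as nn

class ConvRNNCell(nn.Module):
    """Toy RNN cell: matmuls replaced by 1-D convolutions over the width axis."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.input_conv = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.hidden_conv = nn.Conv1d(channels, channels, kernel_size, padding=pad)

    def forward(self, x, h):
        return torch.tanh(self.input_conv(x) + self.hidden_conv(h))

cells = [ConvRNNCell(8) for _ in range(2)]        # RNN layers
x_seq = torch.randn(5, 1, 8, 16)                  # (time, batch, channels, width)
h = [torch.zeros(1, 8, 16) for _ in cells]

for t in range(x_seq.size(0)):                    # loop 1: sequential feed
    inp = x_seq[t]
    for i, cell in enumerate(cells):              # loop 2: RNN layers
        h[i] = cell(inp, h[i])
        inp = h[i]

loss = h[-1].sum()
loss.backward()                                   # should reach every loop iteration
```

If backward really traverses all the loops, every cell's conv weights end up with a non-None .grad after this runs.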

Thank you,

<torch.autograd._functions.basic_ops.Add object at 0x120990138>
<torch.autograd._functions.basic_ops.Add object at 0x120981b30>
<torch.autograd._functions.basic_ops.Add object at 0x1209815c0>
<torch.autograd._functions.basic_ops.Add object at 0x120981050>
<torch.autograd._functions.basic_ops.Add object at 0x120978960>
<torch.autograd._functions.basic_ops.Add object at 0x1209783f0>
<torch.autograd._functions.basic_ops.Add object at 0x120985de8>
<torch.autograd._functions.basic_ops.Add object at 0x120985878>
<torch.autograd._functions.basic_ops.Add object at 0x10963e050>
<torch.autograd._functions.basic_ops.Add object at 0x10965ea48>
<torch.autograd._functions.basic_ops.Add object at 0x10965e4d8>
<torch.autograd._functions.basic_ops.Add object at 0x109670ed0>
<torch.autograd._functions.basic_ops.Add object at 0x1119d56a8>
<torch.autograd._functions.basic_ops.Add object at 0x1119d5138>
<torch.autograd._functions.basic_ops.Add object at 0x1119d2b30>
<torch.autograd._functions.basic_ops.AddConstant object at 0x1119d25c0>
<torch.nn._functions.thnn.auto.NLLLoss object at 0x1119d24d8>
<torch.nn._functions.thnn.auto.LogSoftmax object at 0x1119d23f0>
<torch.autograd._functions.tensor.Index object at 0x1119d2138>
<torch.autograd._functions.tensor.Permute object at 0x1119d2050>
<torch.autograd._functions.basic_ops.Mul object at 0x120993ed0>
<torch.nn._functions.thnn.auto.Sigmoid object at 0x120993a48>
<torch.autograd._functions.basic_ops.Add object at 0x120993960>
<ConvNdBackward object at 0x10966c9d0>
['__call__', '__class__', '__delattr__', '__doc__', '__format__', '__getattribute__', '__hash__', '__init__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '_register_hook_dict', 'register_hook']

What PyTorch version are you using? Does the variable have a next_functions attribute?

print(torch.__version__) gives 0.1.12_2. Is this the latest stable one, or should I go for the bleeding edge?

previous_functions seems to have been removed. In my recent project I use next_functions for back tracing, which works well on the bleeding-edge version.

Here is how I do the back tracing recursively:
https://github.com/lanpa/tensorboard-pytorch/blob/master/tensorboard/graph.py
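A stripped-down sketch of the same idea, assuming a recent PyTorch where backward nodes expose next_functions: walk the graph from loss.grad_fn and record each node's class name (the helper names walk/graph_nodes are my own, not from the linked file).

```python
import torch

def walk(fn, seen):
    """Recursively collect the class name of every backward node reachable
    from fn via next_functions (previous_functions in older versions)."""
    if fn is None or fn in seen:
        return
    seen[fn] = type(fn).__name__
    for next_fn, _ in fn.next_functions:
        walk(next_fn, seen)

x = torch.randn(3, requires_grad=True)
loss = (x * 2 + 1).sum()    # builds a small autograd graph
nodes = {}
walk(loss.grad_fn, nodes)
print(list(nodes.values())) # backward node class names, down to AccumulateGrad at the leaf
```

If the conv1d loops are really on the path to the loss, their backward nodes (e.g. a ConvNdBackward-style entry) will show up in this listing once per use.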