Does this torch implementation of tf.scan break backprop?

I have written the following code to mimic tf.scan

    import torch

    def scan(foo, x):
        # mimic tf.scan: accumulate a running value along dim 0
        res = [x[0].unsqueeze(0)]
        a_ = x[0].clone()

        for i in range(1, len(x)):
            a_ = foo(a_, x[i])           # update the accumulator once
            res.append(a_.unsqueeze(0))  # and record the new value

        return torch.cat(res)

It generates the desired output for a number of examples. My only question is whether the append and torch.cat parts break the backpropagation computation.
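
For instance, with addition as foo it produces a running (cumulative) sum:

    x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    out = scan(lambda a, b: a + b, x)
    # out holds the running sums 1., 3., 6., just like tf.scan with an add op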


Hi Blade!

No, backpropagation will not be broken by append(). Even though
you are appending to a python list, you’re still (presumably) appending
a valid pytorch tensor, and cat() is a valid pytorch tensor operation.
(Of course, something in foo() might break backpropagation.)
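
For example, a (made-up) foo() along the following lines would cut the
graph inside scan(), even though append() and cat() themselves are fine:

def bad_foo (a, b):
    # detach() (or a round trip through numpy) disconnects the result
    # from the graph, so nothing upstream of bad_foo gets gradients
    return (a + b).detach()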

Here is a simple script that illustrates backpropagating through
append() and cat():

import torch
torch.__version__

t1 = torch.autograd.Variable (torch.FloatTensor ([1.0]), requires_grad = True)
t2 = torch.autograd.Variable (torch.FloatTensor ([2.0]), requires_grad = True)
l = []
l.append (t1)
l.append (t2)
t = torch.cat (l)
t.prod().backward()
t1.grad
t2.grad

And here is its output:

>>> import torch
>>> torch.__version__
'0.3.0b0+591e73e'
>>>
>>> t1 = torch.autograd.Variable (torch.FloatTensor ([1.0]), requires_grad = True)
>>> t2 = torch.autograd.Variable (torch.FloatTensor ([2.0]), requires_grad = True)
>>> l = []
>>> l.append (t1)
>>> l.append (t2)
>>> t = torch.cat (l)
>>> t.prod().backward()
>>> t1.grad
Variable containing:
 2
[torch.FloatTensor of size 1]

>>> t2.grad
Variable containing:
 1
[torch.FloatTensor of size 1]
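
(The session above is from pytorch 0.3.0. On current pytorch versions,
where Variable has been merged into Tensor, an equivalent check might
look like this:)

import torch

t1 = torch.tensor ([1.0], requires_grad = True)
t2 = torch.tensor ([2.0], requires_grad = True)
t = torch.cat ([t1, t2])
t.prod().backward()
print (t1.grad)   # tensor([2.])
print (t2.grad)   # tensor([1.])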

Best.

K. Frank