Hi there,

I’m building a graph convolutional network for a research project, and I’ve been stuck on a bug for the past few hours; I think I’m just making an implementation mistake. I compute a recursive Chebyshev polynomial to filter signals on my graph, and here is how it is currently implemented:

```
def chebychev(self, K, x_0, Fout, weights, bias=None, biasEnable=False):
    N, Fin = x_0.shape                 # x_0 is N x Fin
    x = x_0.t().expand(K, -1, -1)      # K x Fin x N
    for i in range(Fin):
        if K > 1:
            # T_1(L) x = L x for feature column i
            x1 = t.sparse.mm(self.L, x_0[:, i].clone().expand(1, -1).t())  # N x 1
            x[1, i] = x1.clone().t()
            x0 = x_0[:, i].clone().expand(1, -1).t()
            for k in range(2, K):
                # T_k(L) x = 2 L T_{k-1}(L) x - T_{k-2}(L) x
                x2 = 2 * t.sparse.mm(self.L, x1) - x0  # N x 1
                x[k, i] = x2.clone().t()
                x0, x1 = x1, x2
    x = x.transpose(2, 1).reshape((N, Fin * K))  # N x Fin*K
    GConv = x @ weights  # weights is Fin*K x Fout, so GConv is N x Fout
    if biasEnable:
        return GConv + bias  # add bias where necessary
    return GConv
```
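For reference, the recurrence I'm trying to implement is the standard Chebyshev one: T_0(L)x = x, T_1(L)x = Lx, and T_k(L)x = 2L T_{k-1}(L)x - T_{k-2}(L)x. A minimal dense sketch of what I expect the method to compute (standalone, with a dense L instead of my sparse class member; the name `chebyshev_dense` is just for illustration):

```
import torch

def chebyshev_dense(L, x0, K):
    """Stack T_0(L) x, ..., T_{K-1}(L) x and flatten to N x Fin*K.
    L: dense (N, N) rescaled Laplacian; x0: (N, Fin) input signal."""
    xs = [x0]
    if K > 1:
        xs.append(L @ x0)                       # T_1(L) x = L x
    for _ in range(2, K):
        xs.append(2 * (L @ xs[-1]) - xs[-2])    # T_k = 2 L T_{k-1} - T_{k-2}
    x = torch.stack(xs)                         # K x N x Fin
    N, Fin = x0.shape
    return x.permute(1, 2, 0).reshape(N, Fin * K)  # N x Fin*K
```

With L set to the identity every T_k(L)x collapses back to x, which gives a quick sanity check independent of the graph.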

Here K is the order of the polynomial, Fin is the number of input features, Fout is the number of output features, and N is the number of nodes (rows) in the input data. When I run my implementation I get a badly scrambled image, even for Fin = Fout = 1. I found that the following slight modification produces the correct result:

```
def chebychev(self, K, x_0, Fout, weights, bias=None, biasEnable=False):
    N, Fin = x_0.shape                 # x_0 is N x Fin
    x = x_0.t().expand(K, -1)          # K x N (only valid for Fin = 1)
    for i in range(Fin):
        if K > 1:
            x1 = t.sparse.mm(self.L, x_0[:, i].clone().expand(1, -1).t())  # N x 1
            x[1] = x1.clone().t()
            x0 = x_0[:, i].clone().expand(1, -1).t()
            for k in range(2, K):
                x2 = 2 * t.sparse.mm(self.L, x1) - x0  # N x 1
                x[k] = x2.clone().t()
                x0, x1 = x1, x2
    x = x.transpose(0, 1).reshape((N, Fin * K))  # N x Fin*K
    GConv = x @ weights  # weights is Fin*K x Fout, so GConv is N x Fout
    if biasEnable:
        return GConv + bias  # add bias where necessary
    return GConv
```

For Fin = Fout = 1 this should return the same result as the code above, yet only the second version works. Can someone help me look through this or point out the issue? It seems to be related to adding the extra dimension to the x tensor.
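In case it helps with debugging: I suspect the relevant detail is that `expand` returns a view that shares memory across the expanded dimension, so an in-place write into the expanded tensor lands in storage that every slice along that dimension aliases. A tiny standalone check of that behavior (tensor names are just for illustration):

```
import torch

a = torch.tensor([[1., 2., 3.]])   # 1 x 3
b = a.expand(4, -1)                # 4 x 3 view; all 4 rows share a's storage

b[1] = torch.tensor([9., 8., 7.])  # writes through the view...
print(b[0])                        # ...so row 0 changed as well
print(a)                           # and so did the source tensor

# .contiguous() (or .repeat) materialises independent rows instead:
c = torch.tensor([[1., 2., 3.]]).expand(4, -1).contiguous()
c[1] = torch.tensor([9., 8., 7.])
print(c[0])                        # row 0 is unchanged
```

If that's the cause, then assigning into `x[1, i]` in my first version would be clobbering every k-slice, but I'd appreciate confirmation.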