[SOLVED] Backpropagation of Concatenated L2 Distances Error

Hello, I am having issues during the loss.backward() step in my implementation. Specifically, one submodule essentially computes the (squared) L2 distance between my input (of size 128x46x46) and a specific set of vectors, and returns the concatenated tensor of these distances. The code is presented below:

import torch
import torch.nn as nn

class CentroidDistances(nn.Module):
    """Custom layer that computes squared L2 distances to a set of centroids."""
    def __init__(self, centroids):
        super(CentroidDistances, self).__init__()
        self.centroids = centroids

    def forward(self, x):
        distance_list = []
        for c in self.centroids:
            # Squared distance of each sample to this centroid, summed over dim 1.
            distance_list.append(torch.sum(torch.pow(x.sub(c.expand_as(x)), 2), 1))
        result = torch.cat(distance_list, dim=0)
        return result
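
For reference, here is a minimal shape check of a single loop iteration (the printed shape assumes torch.sum(..., 1) drops the reduced dimension, i.e. a recent PyTorch):

import torch

x = torch.rand(128, 46, 46)            # input batch, as in the module above
c = torch.rand(46, 46)                 # one centroid
d = torch.sum(torch.pow(x - c, 2), 1)  # squared distances summed over dim 1
print(d.shape)                         # torch.Size([128, 46])
# torch.cat over N such tensors along dim 0 gives torch.Size([N * 128, 46])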

The problem is that during the backward step, this code produces an error:
RuntimeError: size '[1 x 46 x 46]' is invalid for input with 270848 elements

Does anyone have an idea of how to work around this problem? According to other topics, the torch.cat() function supports backpropagation. Thank you!

Your code works for me:

>>> centroids = [Variable(torch.rand(46,46)) for i in range(10)]
>>> net = CentroidDistances(centroids)
>>> x = Variable(torch.rand(128,46,46), requires_grad=True)
>>> y = net(x)
>>> z = y.sum()
>>> z.backward()
>>> x.grad
Variable containing:
( 0 ,.,.) = 
  1.2523e+00  1.1620e+00 -3.7774e+00  ...   4.9212e-01 -7.9828e+00 -6.6876e+00
  ...

Which version of PyTorch are you using?
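
As an aside, the Python loop over the centroids can be replaced by a single broadcast subtraction. This is only a sketch, assuming a recent PyTorch where tensors broadcast directly (no expand_as needed) and torch.sum(..., dim) drops the reduced dimension; CentroidDistancesVectorized is a hypothetical name, not part of the original code:

import torch
import torch.nn as nn

class CentroidDistancesVectorized(nn.Module):
    """Hypothetical broadcast version; output matches the looped module."""
    def __init__(self, centroids):
        super(CentroidDistancesVectorized, self).__init__()
        # Stack the K centroids (each 46x46) into a single (K, 46, 46) tensor.
        self.stacked = torch.stack(list(centroids))

    def forward(self, x):
        # x: (B, 46, 46) -> diff: (K, B, 46, 46) via broadcasting.
        diff = x.unsqueeze(0) - self.stacked.unsqueeze(1)
        # Sum squared differences over x's dim 1 (dim 2 of diff): (K, B, 46),
        # then flatten K and B to match torch.cat(distance_list, dim=0).
        return diff.pow(2).sum(2).reshape(-1, x.size(-1))

The reshape is differentiable just like torch.cat, so loss.backward() behaves the same way.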

Thank you for your response. The problem was that Anaconda was picking up an older PyTorch installation. It works now. Thank you very much!
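
For anyone who hits the same issue, you can check which installation is actually being imported:

import torch
print(torch.__version__)   # version of the active installation
print(torch.__file__)      # where it is being loaded from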
