Hello, I am having an issue during the loss.backward() step in my implementation. Specifically, a submodule of my model computes the L2 distance between my input (of size 128x46x46) and a specific set of vectors, and returns the concatenated tensor of these distances. The code is presented below:
```python
class CentroidDistances(nn.Module):
    """Custom layer that computes the squared L2 distance between the input
    and each centroid, for use in our final model."""

    def __init__(self, centroids):
        super(CentroidDistances, self).__init__()
        self.centroids = centroids

    def forward(self, x):
        distance_list = []
        for i in self.centroids:
            # Squared L2 distance to this centroid, summed over dim 1
            distance_list.append(torch.sum(torch.pow(x.sub(i.expand_as(x)), 2), 1))
        result = torch.cat(distance_list, dim=0)
        return result
```
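For reference, here is a minimal sketch of the same distance computation that relies on broadcasting instead of expand_as and backpropagates cleanly; the shapes and the number of centroids here are assumptions based on the sizes mentioned above, not taken from my actual code:

```python
import torch

# Assumed shapes: a single (1, 128, 46, 46) input and four centroids of
# shape (128, 46, 46) each; these are guesses for illustration.
x = torch.randn(1, 128, 46, 46, requires_grad=True)
centroids = [torch.randn(128, 46, 46) for _ in range(4)]

# Stack the centroids once and let broadcasting align the shapes:
c = torch.stack(centroids)        # (4, 128, 46, 46)
dist = ((x - c) ** 2).sum(dim=1)  # (4, 46, 46): squared L2 per centroid

dist.sum().backward()             # gradients flow back to x
print(dist.shape)                 # torch.Size([4, 46, 46])
print(x.grad.shape)               # torch.Size([1, 128, 46, 46])
```

This avoids both the Python-side loop and the explicit expand_as call, since (1, 128, 46, 46) minus (4, 128, 46, 46) broadcasts to (4, 128, 46, 46) automatically.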
The problem is that during the backward step, this code produces an error:
```
RuntimeError: size '[1 x 46 x 46]' is invalid for input with 270848 elements
```
Does anyone have an idea how to work around this problem? According to other topics, the torch.cat() function supports backpropagation. Thank you!