Hi!
I have a situation where I need to flatten and concatenate the feature maps of each view from the last conv layer, then feed the result through a Linear layer.
```python
def forward(self, views):  # views: (B x n_views x C x H x W)
    views = views.transpose(0, 1)  # (n_views x B x C x H x W)
    aggregated_view = []
    for x in views:  # (B x C x H x W)
        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        x = self.avgpool(x)
        x = x.view(x.size(0), -1)  # Flatten
        aggregated_view.append(x)
    x = torch.cat(aggregated_view, dim=1)  # (B, 512 * n_views)
    x = self.fc(x)
    return x
```
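To make the shapes concrete, here is a minimal sketch of just the aggregation step, with the batch size, view count, and the per-view feature size of 512 being illustrative values (512 matches a ResNet-style backbone after average pooling):

```python
import torch

# Each view yields a (B, 512) feature vector after avgpool + flatten,
# so concatenating n_views of them along dim=1 gives (B, 512 * n_views).
B, n_views, feat = 4, 3, 512
aggregated_view = [torch.randn(B, feat) for _ in range(n_views)]
x = torch.cat(aggregated_view, dim=1)
print(x.shape)  # torch.Size([4, 1536])
```

So the in_features of the final linear layer grows linearly with the number of views, which is exactly the problem below.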
My problem is how to define self.fc, since the number of views can vary. If it were fixed, it could be defined as:
```python
def __init__(self, n_views, n_classe):
    .....
    .....
    self.fc = nn.Linear(512 * n_views, n_classe)
```
I know PyTorch builds the graph dynamically. Can I define the in_features of a linear layer in the forward function and still have that layer's parameters updated?
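To make the question concrete, this is the kind of lazy definition I mean. The sketch below uses nn.LazyLinear, which infers in_features from the first input it sees; the class name and sizes are just for illustration:

```python
import torch
import torch.nn as nn

class MultiViewHead(nn.Module):  # hypothetical name, final classifier only
    def __init__(self, n_classes):
        super().__init__()
        # in_features is left unspecified and inferred on the first forward,
        # so the same definition works for any number of concatenated views.
        self.fc = nn.LazyLinear(n_classes)

    def forward(self, x):  # x: (B, 512 * n_views), n_views not known upfront
        return self.fc(x)

head = MultiViewHead(n_classes=10)
out = head(torch.randn(4, 1536))  # first call materializes fc's weights
print(out.shape)  # torch.Size([4, 10])
```

If something like this is viable, I assume the optimizer would need to be constructed after a first (dummy) forward pass, so that the lazily created parameters actually exist when it collects them.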