# in __init__
self.num_ftrs = num_filters
for i in range(self.num_ftrs):
    setattr(self, "fc%d" % i, nn.Linear(512, 512))
    setattr(self, "fc%d" % i, nn.Linear(512, num_classes))

# in forward
out = out.view(out.size(0), -1)
outputs = []
for i in range(self.num_ftrs):
    out = getattr(self, "fc%d" % i)(out)
    out = F.relu(out)
    outputs.append(getattr(self, "fc%d" % i)(out))

What kind of error are you seeing?
Based on the provided code I would guess you might be running into a shape mismatch, as you are currently redefining fc%d in the first loop: the second setattr overwrites the first, so each fc%d ends up as nn.Linear(512, num_classes), and the forward loop then feeds the [batch_size, num_classes] output of one call straight back into a layer that expects 512 input features.
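A minimal standalone sketch of that effective stacking, using a hypothetical num_classes = 10:

import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes = 10                   # hypothetical value for illustration
fc0 = nn.Linear(512, num_classes)  # the earlier nn.Linear(512, 512) was overwritten

out = torch.randn(1, 512)
out = fc0(out)     # [1, 512] -> [1, num_classes]
out = F.relu(out)
out = fc0(out)     # raises a RuntimeError: fc0 expects 512 input features but receives 10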

Thank you for your reply, @ptrblck.
Yes, the error is a shape mismatch. How can I solve it? I want each head to have a fully connected layer with a ReLU before the final fully connected layer.

You could iterate over all the linear layers you wish to manipulate and replace each one with your additional linear + ReLU followed by the original linear layer, wrapped in an nn.Sequential container.
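A minimal sketch of that replacement (the helper name add_hidden_layers is hypothetical; it assumes heads named fc0, fc1, ... as in the code above):

import torch.nn as nn

def add_hidden_layers(model, num_heads):
    # replace each head model.fc0 .. model.fc{num_heads-1}
    # (each an nn.Linear(512, num_classes)) with Linear + ReLU + the original layer
    for i in range(num_heads):
        old_fc = getattr(model, "fc%d" % i)
        setattr(model, "fc%d" % i, nn.Sequential(
            nn.Linear(512, 512),  # a fresh layer per head, so heads do not share weights
            nn.ReLU(),
            old_fc,
        ))

Because a fresh nn.Linear(512, 512) is constructed inside the loop, each head keeps its own parameters rather than sharing a single module.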

self.layer5 = nn.Sequential(nn.Linear(512, 512),
                            nn.ReLU(),
                            nn.Linear(512, num_classes))
self.num_ftrs = 2
for i in range(self.num_ftrs):
    setattr(self, "fc%d" % i, self.layer5)  # both fc0 and fc1 reference the same self.layer5 instance

and in the forward

clf_outputs = []
for i in range(self.num_ftrs):
clf_outputs.append(getattr(self, "fc%d" % i)(out))

but the outputs clf_outputs[0] and clf_outputs[1] were the same. The outputs are different if I change it to