Hi @ari,
You can have the model return both the last conv layer output and the final output:
def forward(self, x):
    [...]
    c = self.last_conv(...)
    x = self.dense(x)
    return c, x
Next, depending on how you calculate the "loss for each particular filter", you can define your own loss function and sum it with the standard output loss (e.g. nn.CrossEntropyLoss):
def special_loss(c, c_label):
    # Implements the paper's algorithm
    return l

loss = nn.CrossEntropyLoss()
c, output = model(input)
final_loss = loss(output, target) + special_loss(c, c_label)
final_loss.backward()
Keep in mind that a loss function is just normal PyTorch code that returns a scalar (on which you usually call backward()).
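To make the whole pattern concrete, here is a minimal runnable sketch. The model, shapes, and the body of special_loss are all assumptions for illustration (a simple L1 penalty on the conv maps stands in for the paper's per-filter loss):

```python
import torch
import torch.nn as nn

class TwoOutputNet(nn.Module):
    """Toy model that returns both the last conv activations and the logits."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.last_conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.dense = nn.Linear(8 * 28 * 28, num_classes)

    def forward(self, x):
        c = self.last_conv(x)         # conv feature maps
        x = self.dense(c.flatten(1))  # logits from flattened maps
        return c, x

def special_loss(c, c_label):
    # Placeholder for the paper's per-filter loss: an L1 penalty pulling
    # the conv activations toward c_label (NOT the actual paper algorithm).
    return (c - c_label).abs().mean()

model = TwoOutputNet()
loss_fn = nn.CrossEntropyLoss()

inp = torch.randn(4, 1, 28, 28)        # batch of 4 grayscale images
target = torch.randint(0, 10, (4,))    # class labels
c_label = torch.zeros(4, 8, 28, 28)    # dummy target for the conv maps

c, output = model(inp)
final_loss = loss_fn(output, target) + special_loss(c, c_label)
final_loss.backward()  # gradients flow through both loss terms
```

Since both terms are built from the same graph, a single backward() call populates gradients for every parameter, including last_conv.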