Faster R-CNN: getting both predictions and loss during evaluation

Hello,
I am using Faster R-CNN and would like to evaluate my model by checking the loss, IoU, and accuracy.
When the model is in train() mode it returns only the losses, and when in eval() mode it returns only the predictions (needed for calculating accuracy and IoU).

I have seen a few posts about this in the forum, but none of them found a solution.

How can I get both the losses and the predictions from the model during evaluation?
Thank you

Are you working from a specific model tutorial or example?

Hard to answer without knowing more details, but if you’re using the example PyTorch tutorial, I believe the evaluation function runs an inference (predictions) step and then feeds your inferences through some COCO evaluation metrics…

My best attempt at splitting the two apart from one another is in this post.

Thank you for your reply @emcp !

I am using something like this: Object Detection with Faster RCNN | by Arun Prakash | Francium Tech
This is the training part from the tutorial:

My problem is that the model returns either the losses or the predictions, so I want to find a way to get both for evaluation without iterating over my entire dataset twice every epoch (one pass over the train set of 16,000 images with batch_size=32 already takes me 24 minutes).

I am not familiar with an evaluation function in the PyTorch tutorial. I would appreciate it if you could point me to it, if it is relevant to what I want.

Thank you!


I thought you had been looking at the PyTorch torchvision tutorial and its source code:

https://pytorch.org/tutorials/intermediate/torchvision_tutorial.html

The main training function

The train_one_epoch piece

Thank you @emcp
Sorry, I have seen this tutorial; my training procedure is similar to it.
However, in this tutorial they don’t calculate both the loss and IoU for evaluation, only IoU.
I have also looked at their evaluate() function, which runs over the test_loader, but it only calculates IoU and AP, not the loss.
I also want to calculate the loss on the test_loader, to make sure that I am not overfitting to the train_loader while training.

Thank you

In my opinion, you could copy the loss-related code from the train function into the evaluate function.
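Following that idea, here is a minimal sketch of such an evaluation loop, assuming a torchvision-style detection model (the `evaluate_with_loss` name and the loader format are my own). The trick is to call the model twice per batch, once in train() mode under torch.no_grad() for the loss dict and once in eval() mode for the predictions, so you only iterate over the data once per epoch:

```python
import torch

@torch.no_grad()
def evaluate_with_loss(model, data_loader, device):
    """Single pass over data_loader that collects both losses and predictions."""
    total_loss, all_predictions = 0.0, []
    for images, targets in data_loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]

        # Train-mode forward returns the loss dict (no gradients are
        # tracked, thanks to the no_grad decorator above).
        model.train()
        loss_dict = model(images, targets)
        total_loss += sum(loss.item() for loss in loss_dict.values())

        # Eval-mode forward returns per-image prediction dicts
        # (boxes, labels, scores) for the IoU / accuracy metrics.
        model.eval()
        all_predictions.extend(model(images))

    return total_loss / len(data_loader), all_predictions
```

One caveat: ordinary BatchNorm layers still update their running statistics in train() mode even under no_grad. The torchvision detection backbones use FrozenBatchNorm2d, so this is usually safe there, but it is worth checking if you use a custom backbone.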