I’m currently trying to solve an object detection problem and decided to use Faster R-CNN for it. I followed this YouTube video and its code. The loss decreases, but the big problem is that it won’t evaluate correctly no matter what I try. I’ve checked the inputs for any size mismatch or missing information, but it still doesn’t work. It always shows -1 and 0 for all of the metrics, like this:
```
creating index...
index created!
Test: [0/1] eta: 0:00:08 model_time: 0.4803 (0.4803) evaluator_time: 0.0304 (0.0304) time: 8.4784 data: 7.9563 max mem: 7653
Test: Total time: 0:00:08 (8.6452 s / it)
Averaged stats: model_time: 0.4803 (0.4803) evaluator_time: 0.0304 (0.0304)
Accumulating evaluation results...
DONE (t=0.01s).
IoU metric: bbox
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.000
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.000
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = -1.000
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = -1.000
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.000
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.000
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.000
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = -1.000
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = -1.000
<coco_eval.CocoEvaluator at 0x7ff9989fea10>
```
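For context, this is the kind of input check I ran on the ground-truth targets (a sketch with a hypothetical helper name; it assumes torchvision-style targets where `boxes` are `[x1, y1, x2, y2]` in absolute pixels and label `0` is reserved for background):

```python
import torch

def check_targets(targets):
    """Flag common target problems that make COCO evaluation report 0/-1.

    Each target dict is expected to hold 'boxes' as an (N, 4) float tensor
    in [x1, y1, x2, y2] absolute-pixel format and integer 'labels' >= 1.
    Returns a list of human-readable problem descriptions (empty if clean).
    """
    problems = []
    for i, t in enumerate(targets):
        boxes, labels = t["boxes"], t["labels"]
        if boxes.ndim != 2 or boxes.shape[1] != 4:
            problems.append(f"target {i}: boxes shape {tuple(boxes.shape)}")
            continue
        # Degenerate boxes (zero or negative width/height) never match
        # any prediction, so AP/AR stay at 0.
        bad = (boxes[:, 2] <= boxes[:, 0]) | (boxes[:, 3] <= boxes[:, 1])
        if bad.any():
            problems.append(f"target {i}: {int(bad.sum())} degenerate boxes")
        # torchvision detection models treat label 0 as background.
        if (labels < 1).any():
            problems.append(f"target {i}: labels below 1 (0 is background)")
    return problems

# Example: one valid target and one with a flipped box plus a label of 0.
good = {"boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),
        "labels": torch.tensor([1])}
bad = {"boxes": torch.tensor([[50.0, 60.0, 10.0, 10.0]]),
       "labels": torch.tensor([0])}
print(check_targets([good, bad]))
```

Running something like this over my dataset didn’t surface anything obvious, which is why I’m stuck on what else could cause every metric to come out as 0 or -1.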