Confusion Matrix - wrong calculations?

Hey,

I'm using torcheval for the confusion matrix (CM):

import torch
from torcheval.metrics import BinaryConfusionMatrix, BinaryF1Score

model.eval()
f1 = BinaryF1Score()
bcm = BinaryConfusionMatrix()
# NOTE: accuracy is a BinaryAccuracy metric created earlier, before the training loop
with torch.no_grad():
    for images, labels in test_loader:
        images = images.to(device)
        labels = labels.to(device)
        outputs = model(images)
        accuracy.update(outputs.squeeze(1), labels)
        f1.update(outputs.squeeze(1), labels)
        bcm.update(outputs.squeeze(1), labels)
    accc = accuracy.compute()
    f1_score = f1.compute()
    cm = bcm.compute()  # avoid rebinding bcm itself to the result tensor

    print(f"Test accuracy = {accc}")
    print(f"F1 score = {f1_score}")
    print(f"Confusion Matrix = {cm}")

Results:

Test accuracy = 0.921103835105896
F1 score = 0.8769050240516663
Confusion Matrix = tensor([[145.,  89.],
        [ 16., 374.]])

But if you add up the counts in the matrix, there's no way this is consistent with ~92% accuracy (see the check below).
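
Doing the arithmetic on the printed matrix (correct predictions on the diagonal, divided by the total) gives roughly 83%, not 92%:

import torch

cm = torch.tensor([[145., 89.],
                   [16., 374.]])
print((cm.diag().sum() / cm.sum()).item())  # (145 + 374) / 624 ≈ 0.8317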

accuracy is not initialized in your code. I guess you already used it in an earlier loop and the old results are getting mixed with the new ones.
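
torcheval metrics are stateful: every update() accumulates into the same object until you call reset(). A minimal sketch showing how state carries over between runs unless it's cleared:

import torch
from torcheval.metrics import BinaryAccuracy

accuracy = BinaryAccuracy()

# run 1: both predictions correct
accuracy.update(torch.tensor([0.9, 0.1]), torch.tensor([1, 0]))
print(accuracy.compute())  # tensor(1.)

# clear accumulated state before the next evaluation run
accuracy.reset()

# run 2: both predictions wrong
accuracy.update(torch.tensor([0.9, 0.1]), torch.tensor([0, 1]))
print(accuracy.compute())  # tensor(0.) after the reset; 0.5 without it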

accuracy is initialized before the training loop

but my question is about the CM

Hi bro,
can you provide the full code so we can test it with you and try to find a solution?

What exactly is your question, if it's only about the CM and not about the accuracy?

Hey,

we can close this, as I figured it out.

I switched to Lightning metrics (torchmetrics):

from torchmetrics.classification import (
    BinaryAccuracy,
    BinaryConfusionMatrix,
    BinaryPrecision,
    BinaryRecall,
)

# preds and labels come from the evaluation loop
acc = BinaryAccuracy().to(device)(preds, labels)
precision = BinaryPrecision().to(device)(preds, labels)
recall = BinaryRecall().to(device)(preds, labels)
cm = BinaryConfusionMatrix().to(device)(preds, labels)

print(f"Val Accuracy: {acc}")
print(f"Val Precision: {precision}")
print(f"Val Recall: {recall}")
print(f"Confusion Matrix:\n {cm}")

Val Accuracy: 0.8060897588729858
Val Precision: 0.7695390582084656
Val Recall: 0.9846153855323792
Confusion Matrix:
tensor([[119, 115],
        [  6, 384]], device='cuda:0')
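
For what it's worth, the new matrix is consistent with the reported accuracy, which suggests the earlier numbers really were mixing accumulated state:

import torch

cm = torch.tensor([[119, 115],
                   [6, 384]])
print((cm.diag().sum() / cm.sum()).item())  # (119 + 384) / 624 ≈ 0.8061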