TorchMetrics problem: confusion matrix and precision/recall/accuracy/F1 do not agree

Hi, my evaluation outputs are listed below:
Precision: tensor(0.6233)
Recall: tensor(0.7200)
F1 score: tensor(0.6577)
accuracy: tensor(0.7200)
conf: tensor([[242,   0],
              [ 58,   0]])

Is that possible? Could the averaging method for precision, recall, etc. be the cause?
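As a sanity check, here is a small plain-Python sketch that recomputes the macro scores directly from the confusion matrix above (TorchMetrics convention: rows are targets, columns are predictions, so every prediction fell into class 0). The exact values it prints are my own arithmetic, not output from the code in the post:

```python
# Confusion matrix from the evaluation output above.
conf = [[242, 0],
        [58,  0]]

# Per-class precision = TP / column sum; per-class recall = TP / row sum.
tp = [conf[i][i] for i in range(2)]
pred_counts = [sum(row[j] for row in conf) for j in range(2)]  # predicted per class
true_counts = [sum(row) for row in conf]                       # actual per class

# Treat 0/0 as 0, the way TorchMetrics scores a class that was never predicted.
precision = [tp[j] / pred_counts[j] if pred_counts[j] else 0.0 for j in range(2)]
recall = [tp[i] / true_counts[i] for i in range(2)]

macro_precision = sum(precision) / 2  # (242/300 + 0) / 2 ≈ 0.4033
macro_recall = sum(recall) / 2        # (1.0 + 0.0) / 2 = 0.5
```

Neither value matches the 0.6233 / 0.7200 printed above, which is why I suspect the averaging.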

Here are the metric definitions:

self.f1 = F1Score(num_class, average="macro")
self.presicion = Precision(num_class, average="macro")
self.recall = Recall(num_class, average="macro")
self.accuracy = Accuracy(num_classes=num_class, average="macro")
self.conf = ConfusionMatrix(num_classes=num_class, normalize="none")

and the calculation:

for data, target in testloader:
    data, target = data, target
    with torch.no_grad():
        output = self(data)
    _, output = torch.max(output, 1)
    test_score_conf = test_score_conf + self.conf(output, target)
    test_score_f1 = test_score_f1 + self.f1(output, target)
    test_score_accuracy = test_score_accuracy + self.accuracy(output, target)
    test_score_recall = test_score_recall + self.recall(output, target)
    test_score_presicion = test_score_presicion + self.presicion(output, target)

test_score_f1 = data.shape[0] * test_score_f1 / len(test_dataset)
test_score_accuracy = data.shape[0] * test_score_accuracy / len(test_dataset)
test_score_recall = data.shape[0] * test_score_recall / len(test_dataset)
test_score_presicion = data.shape[0] * test_score_presicion / len(test_dataset)
return test_score_f1, test_score_accuracy, test_score_recall, test_score_presicion, test_score_conf
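One detail I noticed about the scaling after the loop: `data.shape[0]` at that point holds only the size of the *last* batch, so the sum of per-batch scores gets weighted by a single batch size. A minimal sketch with hypothetical batch sizes and per-batch scores (these numbers are made up purely for illustration) shows how much that can shift the result when the final batch is smaller:

```python
# Hypothetical: 300 samples in batches of 128 -> sizes [128, 128, 44],
# with made-up per-batch macro scores.
batch_sizes = [128, 128, 44]
per_batch_scores = [0.8, 0.7, 0.5]
n = sum(batch_sizes)  # 300

# What the code above computes: last batch size * sum of scores / N.
last = batch_sizes[-1]
reported = last * sum(per_batch_scores) / n  # 44 * 2.0 / 300 ≈ 0.2933

# A per-batch sample-weighted mean, for comparison.
weighted = sum(b * s for b, s in zip(batch_sizes, per_batch_scores)) / n  # ≈ 0.7133
```

So even before the question of macro vs. micro averaging, the two aggregation schemes give very different numbers unless every batch has the same size.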