Strange values in confusion matrix evaluation

Hello, I'm evaluating a custom model in the following way:

import torch
from torchvision import datasets
from tqdm import tqdm

BATCH_SIZE = 20
image_dataset = datasets.ImageFolder(data_dir, image_transform)
dataloader = torch.utils.data.DataLoader(image_dataset, batch_size=BATCH_SIZE, shuffle=True)
print('classes_idx: {}'.format(image_dataset.class_to_idx))

confusion_matrix = torch.zeros(NUM_CLASSES, NUM_CLASSES)

with torch.no_grad():
	for idx, (inputs, classes) in enumerate(tqdm(dataloader)):
		inputs = inputs.to(DEVICE)
		classes = classes.to(DEVICE)
		outputs, _, _, _ = model(inputs)
		_, preds = torch.max(outputs, 1)
		
		for t, p in zip(classes.view(-1), preds.view(-1)):
			confusion_matrix[t.long(), p.long()] += 1

		print(confusion_matrix)

But after some batches this happens (I'm printing the confusion matrix after each batch):

tensor([[ 278.,  125.,   36.], # (CM BATCH_N = N)
        [   0.,  728.,   36.],
        [ 107.,  134., 1476.]])
 20%|ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–‰                                                                                                          | 146/743 [09:00<34:20,  3.45s/it]
tensor([[2.7900e+02, 1.2800e+02, 3.6000e+01], # (CM BATCH_N = N + 1)
        [1.0000e+00, 7.3100e+02, 3.6000e+01],
        [1.0900e+02, 1.3400e+02, 1.4860e+03]])

Does anyone know why the values are being displayed this way?

The integer values are just being displayed in scientific notation. PyTorch's tensor printer switches to scientific notation once the ratio between the largest and smallest nonzero absolute values in the tensor exceeds 1000 — in your output, that happened as soon as one cell reached 1 while another was above 1000.
You can disable it as shown here:

x = torch.tensor([[2.7900e+02, 1.2800e+02, 3.6000e+01], 
                  [1.0000e+00, 7.3100e+02, 3.6000e+01],
                  [1.0900e+02, 1.3400e+02, 1.4860e+03]])

print(x)  # printed in scientific notation: max/min nonzero ratio exceeds 1000
torch.set_printoptions(sci_mode=False)
print(x)  # now printed as plain fixed-point values
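As an alternative to changing the print options, you could accumulate the counts in an integer tensor instead of a float one — integer dtypes are never printed in scientific notation, and counts are integers anyway. A minimal sketch (the targets/predictions here are made-up illustration data, not from your model):

import torch

NUM_CLASSES = 3
# Integer dtype: counts print as plain integers regardless of magnitude.
confusion_matrix = torch.zeros(NUM_CLASSES, NUM_CLASSES, dtype=torch.int64)

# Hypothetical batch of ground-truth labels and predictions.
targets = torch.tensor([0, 1, 2, 2])
preds = torch.tensor([0, 2, 2, 1])

# Same accumulation loop as in your code.
for t, p in zip(targets.view(-1), preds.view(-1)):
    confusion_matrix[t, p] += 1

print(confusion_matrix)

This also makes the intent of the tensor (a table of counts) explicit in its dtype.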