Loss and accuracy (Newbie)

Hello everyone,
I want to get the test loss and accuracy for every image in every batch, along with its index, not just the accuracy and loss for the whole batch, i.e. a dictionary that stores each tested image's label, loss, and index.
Thanks in advance


You could get the sample-wise losses by setting reduction='none' in the initialization of your criterion.
I’m not sure which use case you are working on, but the accuracy for each image can be calculated regardless of the loss reduction.
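To make this concrete, here is a minimal sketch of both ideas; the logits and targets are dummy stand-ins for `model(data)` and the loader's labels, and the `per_image` dictionary layout is just one possible shape for what the question asked for:

```python
import torch
import torch.nn as nn

# reduction='none' keeps one loss value per sample instead of averaging
criterion = nn.CrossEntropyLoss(reduction='none')

logits = torch.randn(4, 3)            # stand-in for model(data), [batch, classes]
targets = torch.tensor([0, 2, 1, 2])  # stand-in for the batch targets

losses = criterion(logits, targets)   # shape [4]: one loss per image
preds = logits.argmax(dim=1)
correct = preds.eq(targets)           # per-image accuracy (True/False)

# one possible dictionary: index -> predicted label, loss, correctness
per_image = {i: {'label': preds[i].item(),
                 'loss': losses[i].item(),
                 'correct': correct[i].item()}
             for i in range(targets.size(0))}
```

In a real test loop you would offset the batch-local index by the number of samples already seen to get a dataset-wide index.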


Here is actually where I want to get the individual class precisions:

def test(epoch):
    model.eval()
    test_loss = AverageMeter()
    acc = AverageMeter()
    with torch.no_grad():  # replaces the deprecated volatile=True Variables
        for data, target in test_loader:
            if use_cuda:
                data, target = data.cuda(), target.cuda()
            data = data.float()
            output = model(data)
            # torch.max(...)[1] returns the class indices;
            # reduction='mean' replaces the deprecated size_average=True
            entr = F.cross_entropy(output, torch.max(target, 1)[1], reduction='mean').item()
            test_loss.update(entr, target.size(0))
            prec1, = accuracy(output, target)  # test precision in one batch
            acc.update(prec1, target.size(0))
    print('\nTest set: Average loss: {:.4f}, Accuracy: {:.2f}%\n'.format(test_loss.avg, acc.avg))
    writer.add_scalar('Loss/Test', test_loss.avg, epoch)
    writer.add_scalar('Accuracy/Test', acc.avg, epoch)
    return acc.avg

Want to have something like this :

class 1 92% 
class 2 93%
....
class i  xx%

I would recommend storing all predictions and targets in lists inside the loop (don’t forget to wrap this code in a with torch.no_grad() block or detach the predictions).
Once you’ve collected all predictions and the corresponding targets, you could use sklearn.metrics.confusion_matrix (or search here in the board for a PyTorch implementation we’ve posted some time ago).
To get the per-class accs, you could use:

conf = confusion_matrix(targets, preds)
conf = conf / conf.sum(axis=1)[:, np.newaxis]
print(conf.diagonal())
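Put together, the whole flow might look like this self-contained sketch; the preds and targets arrays are dummy data standing in for the predictions and labels you would collect over the test loop:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# stand-ins for the predictions/targets accumulated during testing
targets = np.array([0, 0, 1, 1, 2, 2, 2])
preds   = np.array([0, 1, 1, 1, 2, 2, 0])

conf = confusion_matrix(targets, preds)
# normalize each row by the number of true samples of that class;
# the diagonal then holds the per-class accuracies
per_class_acc = conf.diagonal() / conf.sum(axis=1)

for cls, a in enumerate(per_class_acc):
    print('class {} {:.0f}%'.format(cls, 100 * a))
```

This prints one "class i xx%" line per class, in the format you described.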