 # ImageNet Example Accuracy Calculation

(Rafael A Rivera Soto) #1

I was looking at the topk accuracy calculation code in the ImageNet example and I had a quick question.

```python
def accuracy(output, target, topk=(1,)):
    """Computes the precision@k for the specified values of k"""
    maxk = max(topk)
    batch_size = target.size(0)

    _, pred = output.topk(maxk, 1, True, True)
    pred = pred.t()
    correct = pred.eq(target.view(1, -1).expand_as(pred))

    res = []
    for k in topk:
        correct_k = correct[:k].view(-1).float().sum(0, keepdim=True)
        res.append(correct_k.mul_(100.0 / batch_size))
    return res
```

Doesn’t the `sorted` parameter in the `topk` function have to be set to `False` in order to preserve the ordering, so that the comparison we do in `pred.eq` is valid?

Thanks for taking time to answer this question.

The `sorted` parameter doesn’t affect the ordering of the input samples, which are the rows of `pred`; it only sorts the columns of `pred`, which hold the indices of the top-k labels, into the order [top1, top2, top3, … topk].
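To illustrate with a small made-up example (the scores below are arbitrary): `sorted=True` orders each row's returned indices from best to worst score, while the rows themselves stay in the original sample order, so the later `pred.eq` comparison lines up sample by sample.

```python
import torch

# Toy scores for 2 samples over 5 classes (arbitrary numbers for illustration)
output = torch.tensor([[0.1, 0.6, 0.2, 0.05, 0.05],
                       [0.3, 0.1, 0.4, 0.15, 0.05]])

# sorted=True orders the indices within each row from highest to lowest score;
# row order (the samples) is untouched
_, pred = output.topk(3, dim=1, largest=True, sorted=True)
print(pred)  # tensor([[1, 2, 0], [2, 0, 3]])
```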

#3

I’m a little confused as to the nature of this function’s output. Is it a list of accuracy values for each image tested, or does it calculate some sort of mean of these values and output a single value?

If it’s the former then could one achieve the latter by just returning `res.mean()` ?

Also can you just use `topk=(3)` for a top-3 accuracy for example, rather than `topk=(3,)`?

Thanks for any help on this.

(Enis Berk) #4

I spent a bit of time trying to understand this function because of a wrong assumption, so here is my line-by-line explanation for future reference:
This function calculates precision@k: the fraction of samples whose true class appears among the classifier’s top k predicted classes. For example, with 10 classes and k=6, if the true class lands in the top 6 predictions for half of the samples, the classifier has 50% precision@6.

```python
# INPUTS: output has shape [batch_size, category_count]
#   and target has shape [batch_size] * there is only one true class for each sample
# topk is a tuple of the k values to be included in the precision
# topk has to be a tuple, so if you are giving one number, do not forget the comma
def accuracy(output, target, topk=(1,)):
    """Computes the accuracy over the k top predictions for the specified values of k"""
    # we do not need gradient calculation for those
    # we will use the biggest k, and calculate all precisions from 0 to k
    maxk = max(topk)
    batch_size = target.size(0)
    # topk gives the biggest maxk values along the dim-th dimension of output
    # output was [batch_size, category_count]; dim=1, so we select the biggest
    #   category scores for each batch element
    # input=maxk, so we will select maxk classes
    # so the result will have shape [batch_size, maxk]
    # topk returns a tuple (values, indexes) of results
    # we only need the indexes (pred)
    _, pred = output.topk(input=maxk, dim=1, largest=True, sorted=True)
    # then we transpose pred into shape [maxk, batch_size]
    pred = pred.t()
    # we flatten target and then expand it to be like pred:
    #   target [batch_size] becomes [1, batch_size]
    #   target [1, batch_size] expands to [maxk, batch_size] by repeating the same
    #   correct class answer maxk times
    # comparing pred (indexes) with the expanded target gives the 'correct' matrix
    #   of shape [maxk, batch_size], filled with 1s and 0s for correct and wrong
    #   class assignments
    correct = pred.eq(target.view(1, -1).expand_as(pred))
    """ correct=([[0, 0, 1,  ..., 0, 0, 0],
                  [1, 0, 0,  ..., 0, 0, 0],
                  [0, 0, 0,  ..., 1, 0, 0],
                  [0, 0, 0,  ..., 0, 0, 0],
                  [0, 1, 0,  ..., 0, 0, 0]], device='cuda:0', dtype=torch.uint8) """
    res = []
    # then, for each k, we sum the 1s in the first k rows of the correct matrix
    for k in topk:
        correct_k = correct[:k].view(-1).float().sum(0, keepdim=True)
        res.append(correct_k.mul_(100.0 / batch_size))
    return res
```

Correct me if you see any mistake.

(Gabir Yusuf) #5

Thanks for your explanation. It’s very clear and helpful.

#6

Thanks for the code. It was really helpful with all those comments.
I’m not sure which PyTorch version it was written for, but using PyTorch 1.0.0 I was getting an error on

```python
_, pred = output.topk(input=maxk, dim=1, largest=True, sorted=True)
```

So to make it work, I changed it to

```python
_, pred = torch.topk(output, maxk, dim=1, largest=True, sorted=True)
```