Tensor.topk returns a class that doesn't exist

Calling tensor.topk should return top_probabilities and top_classes.
This call is returning class 0, which shouldn't exist based on my category mapping. What am I doing wrong?

def predict(image_path, model):
    tens_im = preproc(image_path)
    with torch.no_grad():
        # Convert the preprocessed image to a float tensor and add a batch dimension
        tens_im = torch.FloatTensor(tens_im)
        tens_im.unsqueeze_(0)

        # Exponentiate the model's log-probability output to get probabilities
        probabilities = torch.exp(model.forward(tens_im.to(device)))
        top_p, top_class = probabilities.topk(5, dim=1)

        # Move results back to the CPU and strip the batch dimension
        top_p = top_p.to("cpu").numpy()[0]
        top_class = top_class.to("cpu").numpy()[0]

        return top_p, top_class

topk returns the k top values and their indices. Indices start at 0. So if you have N classes, their respective indices are 0, 1, 2, ..., N - 1.

By the way, instead of x.to("cpu") you can use x.cpu().
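For instance, here is a minimal sketch (the values are made up, and torch.randn just stands in for a real model output) showing that index 0 is a perfectly valid result from topk:

import torch

# Fake probabilities for one sample over 5 classes; indices run 0..4
probabilities = torch.softmax(torch.randn(1, 5), dim=1)

top_p, top_class = probabilities.topk(3, dim=1)
print(top_class)  # e.g. tensor([[4, 0, 2]]) -- index 0 simply refers to the first class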

So, when mapping the topk class indices to my class-category mapping, I need to increment the class indices by 1?

If you want your classes labeled 1, 2, ..., N, yes! But keep in mind that they won't correspond to the tensor indices anymore.
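If your category mapping is already keyed by 0-based indices (for example the class_to_idx dict that torchvision's ImageFolder produces; that dict name and the image path below are assumptions about your setup), you can keep the topk indices as they are. A rough sketch, using the predict function above:

# Assumed mapping, e.g. {"daisy": 0, "rose": 1, ...}; invert it to go from index to label
idx_to_class = {idx: cls for cls, idx in class_to_idx.items()}

top_p, top_class = predict("flower.jpg", model)
top_labels = [idx_to_class[i] for i in top_class]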
