torch.topk returns out-of-range indices for probability values

Hi everyone, I am having trouble getting the right class indices from the probabilities returned by PyTorch's topk function.

I have just 2 output classes (cancerous and noncancerous), and the directories are labeled accordingly (0 for cancerous, 1 for noncancerous).

Unfortunately, after getting my output from model.forward(), applying the softmax function, and then applying topk, I get reasonable probabilities, but the indices tend to be [436, 600] (out of range) instead of [0, 1], which would match my class directories.
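For reference, torch.topk indexes into the dimension it searches, so a genuine 2-class tensor can only ever yield indices 0 or 1. A minimal illustration with made-up logits:

```python
import torch

# With a genuine 2-class output, topk can only ever return indices 0 or 1.
probs = torch.softmax(torch.tensor([[2.0, -1.0]]), dim=1)
top_p, top_idx = probs.topk(2, dim=1)
print(top_idx)  # tensor([[0, 1]])

# Indices like 436 or 600 are only possible if the searched dimension
# is at least 601 wide -- e.g. a 1000-class head that was never replaced.
probs_1000 = torch.softmax(torch.randn(1, 1000), dim=1)
_, idx_1000 = probs_1000.topk(2, dim=1)
print(idx_1000)  # two indices anywhere in [0, 999]
```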

I have tried a number of ways to track down the source of these large index values, but couldn't find an answer. Any help on this would be appreciated.

The screenshot below is a typical example of what I am talking about, except that mine has only 2 classes.

Did you change your network’s final layer to only have a two-dimensional output?
Without evidence to the contrary (i.e. the tensor you feed into topk and the result), I would venture that something unexpected about the input to topk is more likely than a problem with topk itself.
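One quick way to test that hypothesis is to print the shape of the tensor fed to topk. A sketch with a stand-in logits tensor:

```python
import torch

logits = torch.randn(1, 1000)  # stand-in for model.forward(image)
probs = torch.softmax(logits, dim=1)
print(probs.shape)  # torch.Size([1, 1000])
# If this prints 1000 rather than 2, the pretrained 1000-class head
# was never replaced, and topk will happily return indices up to 999.
```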

Best regards


Thanks for your response, Tom.
Yes, I load a pretrained vgg13 model and add a classifier at the end with an output layer of just 2 nodes, as below:

Next, topk receives the resulting output probabilities and returns the best 2 probabilities with their indices in sorted order. It's just that the returned indices tend to be out of range. Here is the sample snapshot:

If you have batches, do you need to pass dim to topk?
If that doesn’t fix it, can you print probs, please?
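For what it's worth, a minimal sketch of passing dim explicitly on a batched tensor (the values here are random, just to show the shapes):

```python
import torch

probs = torch.softmax(torch.randn(4, 2), dim=1)  # a batch of 4 predictions
top_p, top_idx = probs.topk(2, dim=1)            # search along the class dim
# Note: when dim is omitted, topk searches the last dimension by default,
# which for a (batch, classes) tensor is already the class dimension.
print(top_idx)  # shape (4, 2); every entry is 0 or 1
```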

Best regards


@Tsakunelson I have met the same problem. My PyTorch version is 0.4.1. Have you solved this problem? Thanks a lot.