Unable to increase batch_size

I have a particular design for testing a model where I use torch.topk, which returns the top k predictions from the classifier sorted from the highest probability down. Then, based on certain "if" checks, I want to pick one of those top-k predictions for each test example.
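For context, torch.topk returns both values and indices, ordered largest first; here is a minimal sketch (the tensors are made-up dummy values, just for illustration):

import torch

logits = torch.tensor([[0.1, 2.5, 0.7],
                       [1.2, 0.3, 3.0]])   # [batch, num_classes], dummy scores
topk = torch.topk(logits, 2)               # top 2 per example, highest first
print(topk.values)    # values:  [[2.5, 0.7], [3.0, 1.2]]
print(topk.indices)   # indices: [[1, 2],     [2, 0]]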

Problem -
When I do this with batch_size > 1, the "if" check is applied to the whole batch at once rather than to each example individually. I am not sure how I can modify the implementation to make the choice per example.
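To make the failure concrete (x stands for whatever per-example score my check uses; the numbers are made up): comparing a scalar threshold against a batch of scores gives one boolean per example, so a single Python "if" cannot act on it.

import torch

threshold = 0.5
x = torch.tensor([0.3, 0.9, 0.6])   # hypothetical per-example scores for a batch of 3
print(threshold < x)                # tensor([False,  True,  True]) -- one answer per example
# if threshold < x:                 # with more than one element this raises:
#     ...                           # "Boolean value of Tensor with more than one element is ambiguous"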

for batch_idx, (image, label) in enumerate(testloader):
    image, label = image.to(device), label.to(device)
    # --------------- Multi Class Classifier ---------------
    predict_A = classifier_A(image)
    # --------------- Get top k probabilities ---------------
    topk = torch.topk(predict_A.data, 2)
    first_pred = topk.indices[:, 0]    # index of the highest-scoring class for each example
    second_pred = topk.indices[:, 1]   # index of the second-highest-scoring class for each example

    # --------------- Intended per-example logic (pseudocode) ---------------
    # if threshold < x:
    #     use first_pred  (prediction with the highest probability)
    # else:
    #     use second_pred (prediction with the second highest probability)
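For reference, with batch_size = 1 the logic behaves the way I intend. A minimal self-contained sketch of that case (threshold, x and the dummy scores are placeholders for my actual setup):

import torch

threshold = 0.9                                  # placeholder value
predict_A = torch.tensor([[0.2, 1.5, 0.3]])      # dummy scores, batch_size = 1
topk = torch.topk(predict_A, 2)
first_pred, second_pred = topk.indices[0, 0], topk.indices[0, 1]
x = topk.values[0, 0]                            # placeholder for the score my check uses
if threshold < x:
    final_pred = first_pred                      # prediction with the highest probability
else:
    final_pred = second_pred                     # prediction with the second highest probability
print(final_pred)                                # tensor(1)

What I am asking is how to apply this kind of per-example choice when the batch contains more than one example.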