Different results of Variable and Tensor

Hi,
I’m new to PyTorch, and the following code snippet confuses me: the commented-out line and the line after it give two different results. Can anyone figure out why?

import torch as th
from torch.autograd import Variable

def test(model, data_loader, threshold=0.5):
    correct = 0
    total = 0
    for idx, batch in enumerate(data_loader):
        image, target = [Variable(_, volatile=True).cuda() for _ in batch]
        output = model(image)
        # Binarize the outputs and cast to the target's dtype
        preds = output.gt(threshold).type_as(target)
        #correct += th.eq(preds, target).sum().data[0]
        correct += th.eq(preds, target).data.sum()
        total += target.numel()

    return correct / total

For example, in the PDB interpreter:
th.eq(preds, target).sum().data[0] returns 117, while th.eq(preds, target).data.sum() returns 319449

Environment:
Anaconda python 3.6 + PyTorch 0.3.1 + Ubuntu 14.04 + CUDA 8

There could be an overflow, since torch.eq returns a torch.uint8 tensor, and summing it accumulates in uint8, which wraps around at 256.
You could convert it to torch.long before calling sum.
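To illustrate the suspected wraparound, here is a plain-Python sketch (an assumption about the mechanism, not PyTorch code itself): a sum accumulated in an unsigned 8-bit integer wraps modulo 256, so a large count of matches comes out wrong.

```python
# Plain-Python sketch of unsigned 8-bit accumulation (assumption:
# this mirrors what summing a uint8 tensor does internally).
def sum_uint8(values):
    acc = 0
    for v in values:
        acc = (acc + v) % 256  # uint8 wraps around at 256
    return acc

matches = [1] * 1000          # pretend 1000 predictions matched
print(sum_uint8(matches))     # wraps: 1000 % 256 == 232
print(sum(matches))           # true count: 1000
```

Casting to a wider type first, e.g. th.eq(preds, target).long().sum(), avoids this because the accumulation then happens in torch.long.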

I think this bug is fixed in the latest stable release, though.
You can find the install instructions on the website.

Cool, converting it to torch.long fixes the problem. Thanks :grin: