Variable logical operators bug?

Consider the following code:

import torch
from torch.autograd import Variable

t = torch.Tensor([2])
bool(torch.max(t) < 2)
Out[4]: False
bool(torch.max(t) < 3)
Out[5]: True

However, if you do the same with a Variable:

v = Variable(t)
bool(torch.max(v) < 2)
Out[6]: True
bool(torch.max(v) < 3)
Out[7]: True

In the meantime I call v.data to work around this.

This is because torch.max(v) returns another Variable containing a single-element Tensor (so that you can keep using it and autograd will work as expected), while torch.max(t) returns a Python number. The comparison on a Variable therefore also produces a Variable, and calling bool() on a Variable always evaluates to True, which is why both conditions above come out True.
For conditions (which are not differentiable, so you don't want to keep a Variable) you can either do torch.max(v.data) or torch.max(v).data[0].
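
For illustration, a minimal sketch of both options, continuing with t and v from above under the same pre-0.4 Variable API:

# torch.max on a plain Tensor yields a Python number,
# but on a Variable it yields another (single-element) Variable.
# Option 1: drop the Variable wrapper first, then reduce:
bool(torch.max(v.data) < 2)      # False, matches the plain-Tensor behaviour
# Option 2: reduce the Variable, then extract the Python number:
bool(torch.max(v).data[0] < 2)   # False as well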


Well, that makes sense. Thanks.