Here’s a reproduction using 0.1.10_2. sub is not an in-place operation, so I don’t understand why it fails for the Variable but not for the Tensor.
import torch
from torch.autograd import Variable
x = torch.rand(1, 2, 3, 4)
# This succeeds.
x.sub(x.max())
v = Variable(x, volatile=True)
# This fails with "RuntimeError: inconsistent tensor size at /data/users/soumith/builder/wheel/pytorch-src/torch/lib/TH/generic/THTensorMath.c:827"
v.sub(v.max())
You could do v.sub(v.max().data[0]) instead.
We are thinking about introducing an autograd.Scalar type to handle this better, rather than returning scalars in autograd as 1-dimensional Tensors with a single element.
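For readers on recent PyTorch (where Variable has been merged into Tensor), a minimal sketch of the same idea: extracting a plain Python number from the reduction result sidesteps any size-mismatch entirely. Here .item() plays the role of the .data[0] workaround above; in 0.1.x the discrepancy presumably arose because Tensor.max() returned a Python number while Variable.max() returned a 1-element Variable.

```python
import torch

x = torch.rand(1, 2, 3, 4)

# On current PyTorch, max() over the whole tensor yields a 0-dimensional
# tensor; .item() converts it to a plain Python float.
m = x.max()

# Subtracting a Python float cannot trigger a tensor size check.
result = x.sub(m.item())
assert result.shape == x.shape
```

Note that current versions also broadcast a scalar tensor automatically, so x.sub(x.max()) works directly; the explicit extraction only matters on the old 0.1.x API shown in the report.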