Division between Variable and Tensor raises an assertion error

Hey,

I’m trying to do something like
c = torch.div(a, b) # <=> c = a.div(b)
with:

  • type(a) = torch.autograd.variable.Variable
  • type(b) = torch.FloatTensor

Using PyTorch 0.2.0+38b42e0, this raises an AssertionError (assert not torch.is_tensor(other)) in torch/autograd/variable.py, line 353.

I do understand what it means, and indeed, here, other (which I called b) is a tensor. But I don’t see why this is forbidden.

I guess I should wrap it into a Variable object, but tbh that does not really make sense to me (or maybe it should be wrapped automatically instead of raising an error?)
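
For reference, here is a minimal sketch of what I mean (the shapes are just placeholders):

import torch
from torch.autograd import Variable

a = Variable(torch.randn(3))   # Variable
b = torch.randn(3)             # plain torch.FloatTensor

# c = torch.div(a, b)          # raises the AssertionError mentioned above
c = torch.div(a, Variable(b))  # works once b is wrapped in a Variable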

Hi,

as a rule, Variables only interact with Variables, and tensors with tensors.
(But quite possibly it would be nice to give a clearer error message…)
I think automatic wrapping is something that people think about now and then, but it seems it is not something that bothers most people very much.
This is in contrast to automatic broadcasting, which to me really improved the experience of coding things up in PyTorch (introduced between 0.1.12 and 0.2)…
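
To make the rule concrete, a small sketch (the names are just placeholders): either keep everything as Variables, or drop down to tensors via .data:

import torch
from torch.autograd import Variable

a = Variable(torch.randn(3))
b = torch.randn(3)

c = torch.div(a, Variable(b))  # Variable with Variable: fine
d = torch.div(a.data, b)       # tensor with tensor: also fine
# torch.div(a, b)              # mixing the two raises the AssertionError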

Best regards

Thomas

Hmm, that makes sense.

It still confuses me when I should use a Variable and when a Tensor. I’m talking specifically about Variables with requires_grad=False (otherwise, of course, it needs to be a Variable).

Just wrap all tensors with Variable. It’s still a tensor underneath, but now you also get the option to use the functionality of Variable. Yeah, the auto-wrapping is something I saw in the works but not implemented yet; I think the holdup is that it would make things much more high-level and cause people some ambiguity.
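
As a sketch (the shapes here are made up): Variable defaults to requires_grad=False, so a wrapped tensor behaves like plain data as far as autograd is concerned, and gradients only flow into the Variables that ask for them:

import torch
from torch.autograd import Variable

weights = Variable(torch.randn(3), requires_grad=True)  # learnable
data = Variable(torch.randn(3))                         # requires_grad defaults to False

out = torch.div(weights, data)  # both are Variables, so this just works
out.sum().backward()            # gradients end up in weights.grad; data.grad stays None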