Hey guys,

I’m currently working with word embeddings, and I need to perform a pointwise division operation over 5-dimensional tensors. I’m not being very successful so far…

Maybe someone can tell me why this is not working:

```
import numpy as np
import torch

def forward(self, x, mask):
    # 1 where `mask` differs from the padding value, 0 elsewhere
    boolean_mask = torch.from_numpy(
        np.not_equal(mask.data.numpy(), self.mask_value).astype('int32'))
    # Count the valid positions along the reduction axis, then move the
    # resulting singleton dimension to the end so it lines up with `total`
    count = boolean_mask.sum(self.axis)
    count.squeeze_(self.axis)
    count.unsqueeze_(len(count.size()))
    total = x.sum(self.axis)
    return total / count.float()
```

To give you a specific example:

`x.size(): (10, 5, 6, 3, 64)`

`mask.size(): (10, 5, 6, 3)`

Everything goes fine right up to the return, where:

`total.size(): (10, 5, 3, 64)`

`count.size(): (10, 5, 3, 1)`
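
Here’s a minimal standalone version of just that final division, with the same shapes but random data, which reproduces the problem for me:

```
import torch
from torch.autograd import Variable

# Same shapes as `total` and `count` above
total = Variable(torch.randn(10, 5, 3, 64))
count = Variable(torch.ones(10, 5, 3, 1))

print(total / count)  # fails with the RuntimeError quoted below
```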

On one hand, if I don’t wrap `count` in a Variable, the division gives me this:

`AssertionError: assert not torch.is_tensor(other)`

On the other hand, if I do `Variable(count)`, I get:

`RuntimeError: inconsistent tensor size at /Users/soumith/code/builder/wheel/pytorch-src/torch/lib/TH/generic/THTensorMath.c:869`

Is this a broadcasting problem, or something like that?
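
In case it is, would wrapping `count` in a Variable and expanding it by hand be the right workaround in the meantime? Just a sketch, with `expand_as` repeating the singleton last dimension so both operands end up at (10, 5, 3, 64):

```
# At the end of forward(), make the broadcast explicit:
count = Variable(count.float())        # (10, 5, 3, 1)
return total / count.expand_as(total)  # both sides (10, 5, 3, 64)
```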