Possible bug when combining a numpy function with Variable()

I am currently stuck trying to understand the following behavior.

import numpy as np
import torch
from torch.autograd import Variable

1.0 + Variable(torch.ones(1)) 
# returns as expected
# Variable containing:
#  2
# [torch.FloatTensor of size 1]

np.sum(1.0) + Variable(torch.ones(1)) 
# returns an unexpected
# array([[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[Variable containing:
#  2
# [torch.FloatTensor of size 1]
# ]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]], dtype=object)

# Switching their order 
Variable(torch.ones(1)) + np.sum(1.0)
# returns the expected
# Variable containing:
#  2
# [torch.FloatTensor of size 1]

This behaviour is independent of np.sum and can be replicated with other numpy functions (e.g. np.exp, np.log, …).
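A workaround that seems to do the trick (just my own sketch, not an official fix) is to cast the numpy scalar back to a plain Python float, so the addition takes the same path as the working 1.0 + Variable(...) case above. Shown here with np.exp:

np.exp(0.0) + Variable(torch.ones(1))
# same deeply nested object array as above

# casting the numpy scalar to a Python float first avoids it
float(np.exp(0.0)) + Variable(torch.ones(1))  # or np.exp(0.0).item()
# Variable containing:
#  2
# [torch.FloatTensor of size 1]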

I am relatively new to pytorch, so I might be missing something obvious that explains this. Am I, or is this really a bug?

Edit: Issue opened: https://github.com/pytorch/pytorch/issues/1294

This looks like a bug to me.
Can you open an issue in the pytorch repo?

Thanks. I opened an issue in the repo.