Why is F.relu not able to process numpy arrays?

I did:

import numpy as np
import torch.nn.functional as F

F.relu(np.ones(3))

but it threw an error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/brandomiranda/miniconda3/envs/hbf_env/lib/python3.6/site-packages/torch/nn/functional.py", line 463, in relu
    return _functions.thnn.Threshold.apply(input, 0, 0, inplace)
  File "/Users/brandomiranda/miniconda3/envs/hbf_env/lib/python3.6/site-packages/torch/nn/_functions/thnn/auto.py", line 126, in forward
    ctx._backend = type2backend[type(input)]
  File "/Users/brandomiranda/miniconda3/envs/hbf_env/lib/python3.6/site-packages/torch/_thnn/__init__.py", line 15, in __getitem__
    return self.backends[name].load()
KeyError: <class 'numpy.ndarray'>

Why? Do torch ops not process numpy arrays natively?

The same issue also happens with torch.nn.ReLU.

F.relu operates on a Variable containing a Tensor, not on a numpy ndarray.

If you want to use torch functions (such as F.relu), convert your numpy ndarray to a torch Tensor first:

import numpy as np
import torch
import torch.nn.functional as F
from torch.autograd import Variable

np_arr = np.array([0, 1, 2, 3], dtype=np.float32)  # float dtype; the THNN backends expect Float/Double tensors
tensor_arr = torch.from_numpy(np_arr)              # shares memory with np_arr
tensor_var = Variable(tensor_arr)                  # wrap for autograd (pre-0.4 API)

F.relu(tensor_var)
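
To get a numpy result back out, convert in the other direction. A minimal sketch of the round trip, continuing from the snippet above:

out = F.relu(tensor_var)   # returns a Variable containing a FloatTensor
out_np = out.data.numpy()  # back to a numpy ndarray (works for CPU tensors only)

Note that torch.from_numpy shares memory with the source array, so this round trip does not copy the data on the way in.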

Yes, but now I need to wrap F.relu to convert and check types everywhere, which seems unnecessary. Why does it need to be like this?

That's how the library works. Torch ops do not process numpy arrays without conversion into torch Tensors; numpy arrays are not even defined on the GPU.
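
If the conversion boilerplate bothers you, you can keep it in one place with a small helper. A sketch (np_relu is a hypothetical name, not part of PyTorch) that accepts a numpy ndarray and returns a numpy ndarray:

import numpy as np
import torch
import torch.nn.functional as F
from torch.autograd import Variable

def np_relu(x):
    # Hypothetical convenience wrapper: numpy in, numpy out.
    t = torch.from_numpy(np.asarray(x, dtype=np.float32))
    return F.relu(Variable(t)).data.numpy()

np_relu(np.ones(3))  # array([1., 1., 1.], dtype=float32)

The tensor inside the wrapper is also what you would move to the GPU with .cuda(), which is exactly what a numpy array cannot do.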