Why can't one pass data through a torch ReLU module directly?

Here is my minimal example:

import torch

x = torch.randn(4,2)
y = torch.nn.ReLU(x)
print(x)
print(y)
tensor([[-0.5674, -2.1012],
        [-0.4920, -0.3471],
        [-0.5981,  0.8400],
        [ 1.7876, -0.1611]])
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-1-be685ec3279f> in <module>
      4 y = torch.nn.ReLU(x)
      5 print(x)
----> 6 print(y)

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py in __repr__(self)
   1034         # We treat the extra repr like the sub-module, one item per line
   1035         extra_lines = []
-> 1036         extra_repr = self.extra_repr()
   1037         # empty string will be split into list ['']
   1038         if extra_repr:

~/anaconda3/lib/python3.7/site-packages/torch/nn/modules/activation.py in extra_repr(self)
    100 
    101     def extra_repr(self):
--> 102         inplace_str = 'inplace' if self.inplace else ''
    103         return inplace_str
    104 

RuntimeError: bool value of Tensor with more than one value is ambiguous

Is this expected behavior?


related posts:

`torch.nn.ReLU` is a module, not a function. For the functional form, use `torch.relu`, `torch.nn.functional.relu`, or `tensor.relu()`.
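A minimal sketch of the three functional forms mentioned above, all of which apply ReLU element-wise and return a new tensor:

```python
import torch

x = torch.randn(4, 2)

# Each of these computes max(x, 0) element-wise without any module object.
y1 = torch.relu(x)
y2 = torch.nn.functional.relu(x)
y3 = x.relu()

# All three produce identical results.
assert torch.equal(y1, y2) and torch.equal(y1, y3)
print(y1)
```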

It seems `a = torch.nn.ReLU()` followed by `a(x)` works.
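A sketch of the module form, which also explains the original error: `torch.nn.ReLU(x)` passes the tensor `x` as the module's `inplace` constructor argument, so `print` later fails when `extra_repr` evaluates `if self.inplace` on a multi-element tensor.

```python
import torch

x = torch.randn(4, 2)

# Instantiate the module first, then call it on the input.
relu = torch.nn.ReLU()
y = relu(x)

# The module call matches the functional form.
assert torch.equal(y, torch.relu(x))
print(y)
```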