Hi
There seems to be a bug where torch.relu() does not work correctly for 2d tensors of type long.
relu works as intended for 1d long tensors, but for 2d long tensors it only seems to work when the tensor is very small: negative values that should be zeroed out are left in place.
It might be me doing something weird here, but to me at least it looks like a bug.
Code to replicate:
import torch
s = torch.relu(torch.LongTensor(6).random_(-5, 5))
print(s.min(), s.max()) # => tensor(0) tensor(4)
a = torch.relu(torch.LongTensor(2,2).random_(-5, 5))
print(a.min(), a.max()) # => tensor(0) tensor(3)
b = torch.relu(torch.LongTensor(10,10).random_(-5, 5))
print(b.min(), b.max()) # => tensor(-5) tensor(4)
print(b)
"""
tensor([[ 0, -4, -2, 0, 0, 0, 0, 0, -5, 0],
[ 0, 0, 0, -2, -2, -1, -4, -5, -4, 0],
[ 0, -2, -2, 0, 0, -2, -3, -3, -5, 0],
[-5, 0, -3, -2, 0, 0, -4, 0, -5, 0],
[-3, -4, -5, -1, 0, -1, 0, 0, -2, 0],
[-4, 0, 0, -4, 0, 0, -5, -1, -1, 0],
[ 0, 0, -4, -2, -1, 0, -2, 0, -2, 0],
[-3, 0, 0, -4, 0, 0, 0, 0, -4, 0],
[ 0, 0, 0, 0, 0, 0, 0, 0, 0, -5],
[ 0, 0, 0, 0, 0, 0, 0, 4, 2, 2]])
"""