ReLU turns NaN into zeros

As of PyTorch 0.4.1 this is no longer the case: relu(NaN) == NaN

In [1]: import torch

In [2]: x = torch.ones(1).float()+float('NaN')

In [3]: x
Out[3]: tensor([    nan])

In [4]: x.relu()
Out[4]: tensor([    nan])

I’ve happily (ab)used the old behaviour before. Is there a suggested new method for how to set NaN values in a tensor to 0?

I assume the suggested method my_tensor[torch.isnan(my_tensor)] = 0. will cause problems on the GPU and have a high memory cost, since it allocates a boolean mask the size of the tensor and uses advanced indexing. Is there some other method?
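For reference, here is the kind of out-of-place alternative I have in mind (a minimal sketch; torch.where and torch.isnan are both available as of 0.4.1, though it still allocates the mask and a result tensor):

import torch

x = torch.ones(1) + float('nan')

# Select elementwise: where the mask is True take 0, otherwise keep x.
# This stays out-of-place and runs the same on CPU and GPU.
cleaned = torch.where(torch.isnan(x), torch.zeros_like(x), x)

(For anyone reading later: newer PyTorch releases also provide torch.nan_to_num, e.g. torch.nan_to_num(x, nan=0.0), which handles this directly.)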