NaN in input tensors


I wonder how PyTorch deals with NaN values in the inputs. Are convolutions of NaN again NaN? And what is ReLU(NaN)?

Is there a recommended way to deal with NaN values (other than setting NaNs to a constant value, e.g. zero)?


As far as I know, NaN values are propagated by mathematical operations.
ReLU seems to zero out the NaNs (at least in the version used here):

import torch
import torch.nn as nn

x = torch.randn(1, 3, 10, 10)
x[0, 0, 0, 0] = torch.log(torch.tensor([-1.]))  # log(-1) creates a NaN

m = nn.Conv2d(3, 6, 3, 1, 1)
output = m(x)
print(output[0, 0, 0:3, 0:3])
> tensor([[    nan,     nan,  0.5057],
        [    nan,     nan, -0.4137],
        [-0.8564,  0.1068, -1.0790]])

r = nn.ReLU()
output = r(output)
print(output[0, 0, 0:3, 0:3])
> tensor([[ 0.0000,  0.0000,  0.5057],
        [ 0.0000,  0.0000,  0.0000],
        [ 0.0000,  0.1068,  0.0000]])

I would try to look for the origin of the NaN values, since in most cases they indicate a bug.
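A quick way to locate where the NaNs first appear is torch.isnan (and, for the backward pass, torch.autograd.set_detect_anomaly); a minimal sketch:

```python
import torch

x = torch.tensor([1.0, float('nan'), 3.0])
print(torch.isnan(x))        # tensor([False,  True, False])
print(torch.isnan(x).any())  # tensor(True)

# During training, anomaly detection points to the operation that
# produced a NaN in the backward pass (uncomment inside a training loop):
# with torch.autograd.set_detect_anomaly(True):
#     loss.backward()
```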


Thanks a lot. Unfortunately they aren’t really avoidable, because they are part of my dataset.

Would you recommend setting them to zero, or to a value outside the usual data range, like 1e6?

Do you have a dataset consisting of features, i.e. do the NaN values indicate a missing feature?
If so, you could try to set them to the median value of this particular feature, or to a categorical value indicating a missing feature, e.g. -1.
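As a sketch of the median idea (assuming a samples × features matrix; torch.nanmedian needs a reasonably recent PyTorch, >= 1.8):

```python
import torch

# toy feature matrix (samples x features) with a missing entry
feats = torch.tensor([[1.0, float('nan')],
                      [3.0, 4.0],
                      [5.0, 6.0]])

# per-feature median that ignores NaNs, then fill the holes with it
med = torch.nanmedian(feats, dim=0).values
filled = torch.where(torch.isnan(feats), med.expand_as(feats), feats)
print(filled)  # the NaN becomes 4.0, the median of its column
```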

But I think it depends on the data you have.
Could you explain what kind of data you have and why the NaNs are there?


The conv layers can’t magically handle NaN values using some kind of interpolation or semi-supervised learning, so you need to figure out yourself how to convert the NaN values into a concrete value that makes sense to the model you have written. As Patrick says, using the median/average value might be a quick way to get something working. Otherwise you can look at more complex techniques, which you could find by googling e.g. ‘handling missing data’ (I imagine a ton of Kaggle competition write-ups would give some insight into how to handle missing data).

I have the same issue, so maybe this topic is also related to mine. I am handling datasets that describe certain chemicals on the ocean surface. You can see this as a map with valid values on the sea surface and a mask (or NaNs) on the land surface.

Ideally, I would like to ignore the NaNs by using the same padding techniques as at the image borders, and to completely remove full-NaN fields after the flattening. It is worth mentioning that the NaNs are at the same positions for all images and features. Any hints on how I can do this?

NaN values are tricky to handle, as you cannot simply mask them, e.g. by multiplying with zeros.
If you are using a mask for your targets, I would recommend using a specific class index (e.g. nb_classes+1) and dealing with these values instead.
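For segmentation-style targets, one concrete way to do this is the ignore_index argument of nn.CrossEntropyLoss, which drops the masked positions from the loss entirely (a sketch with made-up shapes; here -1 marks the masked/NaN positions):

```python
import torch
import torch.nn as nn

nb_classes = 3
criterion = nn.CrossEntropyLoss(ignore_index=-1)

logits = torch.randn(2, nb_classes, 4, 4)         # N x C x H x W
target = torch.randint(0, nb_classes, (2, 4, 4))  # valid class indices
target[:, 0, :] = -1                              # e.g. land / NaN pixels

loss = criterion(logits, target)  # masked pixels contribute nothing
print(loss)
```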


In PyTorch 1.7.1, ReLU does not seem to behave like this.

torch version: 1.7.1
tensor([[    nan,     nan, -0.2346],
        [    nan,     nan,  1.3086],
        [-0.0514, -0.6495, -0.5092]], grad_fn=<SliceBackward>)
tensor([[   nan,    nan, 0.0000],
        [   nan,    nan, 1.3086],
        [0.0000, 0.0000, 0.0000]], grad_fn=<SliceBackward>)


import torch

# make the repro deterministic
seed = 0
torch.manual_seed(seed)

print('torch version: {}'.format(torch.__version__))

x = torch.randn(1, 3, 10, 10)
x[0, 0, 0, 0] = torch.log(torch.tensor([-1.]))  # log(-1) creates a NaN

m = torch.nn.Conv2d(3, 6, 3, 1, 1)
output = m(x)
print(output[0, 0, 0:3, 0:3])

r = torch.nn.ReLU()
output = r(output)
print(output[0, 0, 0:3, 0:3])

This behavior seems to have changed compared to the earlier version.
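The current behavior can be verified directly on a scalar; ReLU now propagates NaN instead of zeroing it:

```python
import torch

out = torch.relu(torch.tensor(float('nan')))
print(out, torch.isnan(out).item())  # tensor(nan) True
```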

I also have a dataset that contains null values, but I don’t know what value to replace them with.
My data is null because the values don’t exist in some situations.
What do you think I should do?
Many thanks.

I don’t know which approach would work best and would refer to this post.

Thank you so much.
I realized I had to stop using NaN because the model couldn’t work with it.