Data normalized between -1 and 1 and activation function

I have normalized the image data between -1 and 1 before giving it to the CNN. The CNN uses a Conv -> Instance Norm -> ReLU sequence.

Since the activation function is ReLU, does it create a problem that the negative outputs from Instance Norm will be clipped to 0, given that the input is between -1 and 1?
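For reference, the block I mean looks like this (a minimal PyTorch sketch; the channel sizes are arbitrary, just for illustration):

```python
import torch
import torch.nn as nn

# The block in question: Conv -> Instance Norm -> ReLU.
block = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.InstanceNorm2d(16),
    nn.ReLU(),
)

x = torch.rand(1, 3, 32, 32) * 2 - 1       # fake image normalized to [-1, 1]
out = block(x)
print(out.min().item(), out.max().item())  # min is 0.0: negatives were clipped
```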

Why don't you try another activation function?

In one CNN I referred to, they used the ReLU activation function and got the output. I tried a somewhat different CNN architecture using ReLU but got stuck here.

Is it possible to use ReLU when the data is normalized between -1 and 1? If not, which activation would be better?

ReLU works well with CNNs. The classical sequence is Conv + BN + ReLU stacked, finishing with a fully connected (FC) layer and some activation function like sigmoid ([0, 1]) to constrain the output range for classification.
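For example, a minimal PyTorch sketch of that stack (the channel sizes and class count are arbitrary placeholders):

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),
    nn.ReLU(),
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(64, 10),  # FC layer, 10 classes as a placeholder
    nn.Sigmoid(),       # squashes the output into [0, 1]
)
```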

On the other hand, if your data is in [-1, 1] at the final output, you could turn it into probabilities using Softmax if you are working on classification.
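For example (the `logits` tensor here is just a stand-in for your raw network outputs):

```python
import torch

logits = torch.randn(4, 10)           # stand-in for raw network outputs
probs = torch.softmax(logits, dim=1)  # each row now sums to 1
```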

What about your loss function? Maybe it is not working well.

I think it should not be a problem until the very last activation of the network before classification, where you have to make sure the output is in the range you expect (usually either [0, 1] or [-1, 1]). You can check that by looking at the output tensor's .min() and .max() values.
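Something like this (the model and batch here are just placeholders to show the check):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1), nn.Tanh())  # placeholder model
images = torch.rand(2, 3, 32, 32) * 2 - 1                        # placeholder batch

out = model(images)
print(out.min().item(), out.max().item())  # should lie in the expected range
```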

Remember that the activation is there to introduce non-linearity in the network.

Hi joekid,
It's a regression problem; I am working on a GAN. The data is normalized between -1 and 1 before being fed to the first layer, and the output of the CNN appears in a denominator (I think it should be between -1 and 1, since the other data is in the same range) that is used in image restoration.

The question is: will ReLU be fine for data normalized between -1 and 1? If yes, why is it useful, since it clips the negative values, which is a loss of information?

Since we need the final output in the range -1 to 1, is it fine if I use

(1) ReLU in every layer except the last, with LeakyReLU in the last layer, or
(2) receive an output between 0 and 1 from the last layer and then convert it to the range -1 to 1 (sketched below)?
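Option (2) would just be an affine rescale, something like this (the output tensor here is a stand-in):

```python
import torch

out01 = torch.sigmoid(torch.randn(2, 3, 32, 32))  # stand-in output in [0, 1]
out = out01 * 2.0 - 1.0                           # affine rescale to [-1, 1]
```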


Hi victorc25,

If it's a regression problem where the output is also an image, then what?

I'm not exactly understanding the question, but maybe an example could help.

Here you can see some code I added for tests with different output ranges. I was testing different loss functions that required either a [0, 1] range or a [-1, 1] range, so I changed that normalization via a parameter when loading the images elsewhere, and these caps are also selected by a parameter.
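In simplified form, the idea looks like this (a sketch, not the exact code; the `cap` parameter name is just for illustration):

```python
import torch.nn as nn

def make_output_cap(cap: str) -> nn.Module:
    """Select the final activation based on the desired output range."""
    if cap == "sigmoid":  # outputs in [0, 1]
        return nn.Sigmoid()
    if cap == "tanh":     # outputs in [-1, 1]
        return nn.Tanh()
    if cap == "none":     # uncapped: whatever range the network produces
        return nn.Identity()
    raise ValueError(f"unknown cap: {cap}")
```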

You can leave your network uncapped at the end (no activation function) if you want the full range your network produces, but images have range limitations if you want to treat the outputs as images and calculate metrics, losses and so on, so you need to take care of that yourself if you don't cap it.
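If you do leave it uncapped, you can still force a valid range afterwards, for example:

```python
import torch

raw_out = torch.randn(1, 3, 32, 32) * 3  # stand-in for an uncapped network output
out = raw_out.clamp(-1.0, 1.0)           # force into [-1, 1] before metrics/saving
```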

Does it help?