# Softmax giving NaNs and negative values as output

Hi,

I am using softmax at the end of my model.

However, after some training, softmax is giving negative probabilities. In some situations I have encountered NaNs as probabilities as well.

One solution I found while searching is to use a normalized softmax; however, I cannot find any PyTorch implementation for this.

Can someone please let me know if a normalized softmax is available, or how to achieve this so that forward and backward propagation are smooth?
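For context, softmax itself cannot produce negative values; negative or NaN "probabilities" usually mean the logits coming into it are already NaN or infinite (e.g. from a diverging loss). A common numerical safeguard is to subtract the maximum logit before exponentiating, which `torch.nn.functional.softmax` already does internally. Below is a minimal pure-Python sketch of that max-subtraction trick, just to illustrate the idea (the function name `stable_softmax` is my own, not a PyTorch API):

```python
import math

def stable_softmax(logits):
    # Subtract the max logit before exponentiating so exp() never
    # overflows to inf; this leaves the resulting ratios unchanged.
    # torch.nn.functional.softmax applies the same shift internally.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# A naive exp(1002.0) would overflow; the shifted version is fine:
probs = stable_softmax([1000.0, 1001.0, 1002.0])
print(probs)  # all values in [0, 1], summing to 1
```

If you still see NaNs with the built-in softmax, the problem is almost certainly upstream: check the model outputs with `torch.isnan(logits).any()` before the softmax, and consider lowering the learning rate or clipping gradients.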

The focus of PyTorch is not on data processing; such issues are usually resolved with third-party toolkits. You can use the `StandardScaler` from `scikit-learn`. You could also look into `skorch` (usually you don't need that package, just `scikit-learn`).
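For what `StandardScaler` does under the hood, here is a minimal pure-Python sketch (the helper name `standardize` is my own, for illustration): each feature column is shifted to zero mean and scaled to unit variance, which can help keep logits in a well-behaved range.

```python
from statistics import mean, pstdev

def standardize(column):
    # Mirrors what sklearn.preprocessing.StandardScaler does for a
    # single feature: subtract the mean, divide by the (population)
    # standard deviation.
    mu, sigma = mean(column), pstdev(column)
    return [(x - mu) / sigma for x in column]

scaled = standardize([1.0, 2.0, 3.0, 4.0])
print(scaled)  # mean ~0, unit variance
```

With `scikit-learn` itself you would call `StandardScaler().fit_transform(X)` on the training data and reuse the fitted scaler on validation and test data.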