Incorrect loss in a fully-connected NN


I am working on a project to predict soccer player values from a set of inputs. The data consists of about 19,000 rows and 8 columns (7 input columns and 1 target column), all numerical.

I am using a fully connected neural network for the prediction, but the loss is not decreasing as it should. It is very large (around 1e+13) and just fluctuates instead of going down.

This is the code I am using to run the model:

The model is a fully connected neural network with 4 hidden layers, each with 7 neurons. The input layer has 7 neurons and the output layer has 1. I am using MSE as the loss function. I tried changing the learning rate, but the loss is still bad.
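For reference, a minimal sketch of the architecture described above, assuming PyTorch (the class name and batch size are illustrative, not from the original post):

```python
import torch
import torch.nn as nn

class PlayerValueNet(nn.Module):
    """7 inputs -> 4 hidden layers of 7 neurons -> 1 output, as described."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(7, 7), nn.ReLU(),
            nn.Linear(7, 7), nn.ReLU(),
            nn.Linear(7, 7), nn.ReLU(),
            nn.Linear(7, 7), nn.ReLU(),
            nn.Linear(7, 1),
        )

    def forward(self, x):
        return self.net(x)

model = PlayerValueNet()
criterion = nn.MSELoss()

x = torch.randn(32, 7)   # a batch of 32 players, 7 features each
y = torch.randn(32, 1)   # dummy targets
loss = criterion(model(x), y)
print(model(x).shape)    # torch.Size([32, 1])
```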

What could be the reason behind this?
Thank you!

Could you check the range of the logits and compare it to the targets? Since the loss is extremely large, I would guess that your targets also contain large values. If so, try normalizing them during training and un-normalizing the model output during validation to get back the expected value ranges.
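A sketch of what that normalization could look like, assuming standardization with training-set statistics (the sample values and variable names are made up for illustration):

```python
import torch

# Example targets with large values, e.g. player market values in euros.
y_train = torch.tensor([1.5e6, 3.0e7, 8.0e5, 1.2e7])

# Compute statistics on the *training* targets only.
y_mean = y_train.mean()
y_std = y_train.std()

# Train the model against the normalized targets.
y_norm = (y_train - y_mean) / y_std

# At validation time, un-normalize the model output to recover
# predictions in the original value range.
pred_norm = torch.tensor([0.3])       # example raw model output
pred = pred_norm * y_std + y_mean
```

Note that `y_mean` and `y_std` must come from the training split only and be reused as-is at validation time, otherwise the un-normalized predictions will be on a different scale than the targets.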