My model only predicts values in a very narrow range

This is my model and its variables:

input_size_prot = 1024
input_size_comp = 196
hidden_size_prot = 32
hidden_size_all = 100
output_size = 1

batch_size = 40

class pcNet(nn.Module):

    def __init__(self, input_size_prot, input_size_comp, hidden_size_prot, hidden_size_all, output_size):
        super(pcNet, self).__init__()
        self.fc_prot = nn.Linear(input_size_prot, hidden_size_prot)
        self.fc_all = nn.Linear(hidden_size_prot + input_size_comp, hidden_size_all)
        self.fc2 = nn.Linear(hidden_size_all, output_size)

    def forward(self, x):
        # x[0]: protein features, x[1]: compound features
        out = F.leaky_relu(self.fc_prot(x[0]))
        out =, x[1]), dim=1)  # concatenate protein embedding with compound features
        out = F.leaky_relu(self.fc_all(out))
        out = F.relu(self.fc2(out))
        return out

The labels I am trying to predict are all values between 5 and 11, but my model only predicts values between 5 and 7.5.
The loss converges around 20. It is possible that my data (two sets of tensors) does not allow for better results, but perhaps you have ideas for my model.

Based on what you have shared, I need a few more details to help you figure this out:

  • What type of loss function are you using?
  • Which optimization method are you using, and what are its hyperparameters?
  • How big is your training dataset?
  • How long are you training for, i.e. how many epochs?
  • Are you using any type of learning rate scheduling, early stopping, or similar?

Loss and optimization:
criterion = torch.nn.MSELoss(reduction='sum')
optimizer = optim.Adam(model.parameters(), lr=0.0001)
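One thing worth knowing about `reduction='sum'`: the reported loss scales with batch size, so the absolute value it converges to depends on how many samples are in each batch, whereas `reduction='mean'` reports a per-sample average. A plain-Python illustration (the numbers here are made up, not from the actual data):

```python
# Illustrative only: how 'sum' and 'mean' reductions of MSE differ.
preds = [6.0, 6.5, 7.0, 5.5]
labels = [8.0, 9.0, 10.0, 7.0]

sq_errors = [(p - y) ** 2 for p, y in zip(preds, labels)]
mse_sum = sum(sq_errors)              # what reduction='sum' reports: 21.5
mse_mean = mse_sum / len(sq_errors)   # what reduction='mean' reports: 5.375
```

With a batch size of 40, a "loss around 20" under `reduction='sum'` therefore corresponds to a per-sample squared error of roughly 0.5, which is a useful sanity check when comparing runs.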

As you can see, there are two inputs for data: one set has 375 elements, the other 63, and I am checking all possible interactions, so 23625 pairs in total.
I am using an 80/20 split, so 18900 interactions for training and 4725 for testing.
This also means that all of the original 375 and 63 elements appear in both training and testing (it's supposed to be like that).
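The pairwise split described above can be sketched as follows; the index-pair representation and the fixed random seed are illustrative choices, not the asker's actual code:

```python
import random

# Build all protein-compound index pairs, then split 80/20.
n_prot, n_comp = 375, 63
pairs = [(i, j) for i in range(n_prot) for j in range(n_comp)]  # 375 * 63 = 23625 pairs

random.seed(0)  # illustrative seed for reproducibility
random.shuffle(pairs)

split = int(0.8 * len(pairs))
train_pairs, test_pairs = pairs[:split], pairs[split:]  # 18900 / 4725
```

Because every protein and every compound participates in many pairs, this split tests generalization to unseen *combinations*, not to unseen proteins or compounds.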

Currently I am training for 5 epochs, but the model converges after two.

I am not using any type of learning rate scheduling, early stopping, or similar.