Predicted training values do not match the actual training values

I am new to implementing neural networks. I am trying to implement a Bayesian neural network; my training dataset has 2k samples with 8 features each (8-D inputs).
Here are the steps I have followed:

  1. I normalized both `x_train` and `y_train` using sklearn's `StandardScaler` (a minimal sketch of this step is shown right after the network code below).
  2. My neural network:
    # Define the Bayesian neural network
    # (bnn comes from the torchbnn package)
    import torch
    import torch.nn as nn
    import torchbnn as bnn

    class BayesianNeuralNet(nn.Module):
        def __init__(self, dropout_rate):
            super(BayesianNeuralNet, self).__init__()
            self.hidden1 = bnn.BayesLinear(prior_mu=0, prior_sigma=0.1, in_features=8, out_features=256)    # input layer to hidden layer
            self.act1 = nn.LeakyReLU()
            self.dropout1 = nn.Dropout(dropout_rate)
            self.hidden2 = bnn.BayesLinear(prior_mu=0, prior_sigma=0.1, in_features=256, out_features=256)  # hidden layer to hidden layer
            self.act2 = nn.LeakyReLU()
            self.dropout2 = nn.Dropout(dropout_rate)
            self.hidden3 = bnn.BayesLinear(prior_mu=0, prior_sigma=0.1, in_features=256, out_features=1)    # hidden layer to output layer

        def forward(self, x):
            hidden1 = self.act1(self.hidden1(x))
            hidden1_dropout = self.dropout1(hidden1)
            hidden2 = self.act2(self.hidden2(hidden1_dropout))
            hidden2_dropout = self.dropout2(hidden2)
            output = self.hidden3(hidden2_dropout)
            # output = torch.clamp(output, min=0.003)
            output = torch.log(1 + torch.exp(output))  # softplus, keeps the output positive
            # output = torch.tanh(output)
            return output

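For reference, the scaling in step 1 looks roughly like this (a minimal sketch, not my exact code; `scaler_x` and `scaler_y` are my names for the two fitted scalers, and the reshape assumes `y_train` starts out as a 1-D array):

    from sklearn.preprocessing import StandardScaler

    # Separate scalers for the 8-D inputs and the 1-D target
    scaler_x = StandardScaler()
    scaler_y = StandardScaler()

    # x_train: (N, 8), y_train: (N,) -> reshape to (N, 1) for the scaler
    x_train_scaled = scaler_x.fit_transform(x_train)
    y_train_scaled = scaler_y.fit_transform(y_train.reshape(-1, 1))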
  3. Here is my model:

    # Define the Bayesian neural network model
    model = BayesianNeuralNet(dropout_rate=0.9)

    # MSE loss on the predictions plus a weighted KL term from the Bayesian layers
    loss_fn = nn.MSELoss()
    KL_loss_fn = bnn.BKLLoss(reduction='mean', last_layer_only=False)
    KL_weight = 0.001

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

    # Train the Bayesian neural network
    num_epochs = 10000

    for epoch in range(num_epochs):
        for batch_x, batch_y in train_loader:
            # Forward pass
            outputs = model(batch_x)
            loss = loss_fn(outputs, batch_y)
            KL = KL_loss_fn(model)
            cost = loss + KL_weight * KL

            optimizer.zero_grad()  # Zero the gradients
            cost.backward()        # Backward propagation
            optimizer.step()       # Update the model parameters

        if (epoch + 1) % 100 == 0:
            print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')
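The `train_loader` used above is built roughly as follows (a sketch with an assumed batch size of 64; `x_train_scaled` and `y_train_scaled` are the scaled arrays from step 1):

    from torch.utils.data import TensorDataset, DataLoader

    # Wrap the scaled arrays as float32 tensors and batch them
    x_tensor = torch.tensor(x_train_scaled, dtype=torch.float32)
    y_tensor = torch.tensor(y_train_scaled, dtype=torch.float32)

    train_dataset = TensorDataset(x_tensor, y_tensor)
    train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)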
  4. My predicted min and max values on the training data are:

        Max_val_train_data: 1.3269883
        Min_val_train_data: 0.83512664

     whereas the expected values are:

        Max_val_train_data: 1.3269
        Min_val_train_data: 0.0688
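These numbers are computed roughly as follows (a sketch; `scaler_y` is the scaler fitted on `y_train`, and `inverse_transform` maps the scaled predictions back to the original units):

    # Predict on the (scaled) training inputs
    model.eval()  # disables dropout for evaluation
    with torch.no_grad():
        preds_scaled = model(x_tensor)

    # Undo the target scaling before comparing with the original y_train
    preds = scaler_y.inverse_transform(preds_scaled.numpy())

    print('Max_val_train_data:', preds.max())
    print('Min_val_train_data:', preds.min())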

The predicted minimum never moves away from 0.83512664, while the expected minimum is 0.0688.

Thank you in advance for your responses, and please suggest where my model is going wrong.