Neural net training: all values in the output vector converge to one value

Hi, I am training a neural network model.
The loss function is built from a likelihood function.
The network has 4 hidden layers with 100 nodes each and ReLU activations.

The output is a vector of size 500.
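
For concreteness, here is a minimal sketch of this architecture. PyTorch is just my stand-in framework, and input_dim is a placeholder; only the layer widths, the ReLU, and the output size come from the actual setup:

    import torch.nn as nn

    # Sketch of the described network: 4 hidden layers of 100 units with
    # ReLU activations, and an output vector of size 500.
    input_dim = 32  # placeholder; not the real input size

    model = nn.Sequential(
        nn.Linear(input_dim, 100), nn.ReLU(),
        nn.Linear(100, 100), nn.ReLU(),
        nn.Linear(100, 100), nn.ReLU(),
        nn.Linear(100, 100), nn.ReLU(),
        nn.Linear(100, 500),  # raw outputs fed to the likelihood-based loss
    )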

As I train the model for more epochs, the values in the output vector converge to a single value.
The vectors in brackets below are yhat. After running some epochs, the values in yhat all collapse to essentially one value, but they should be different for each observation.
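
A simple way to quantify this collapse is to log the spread of yhat each epoch; here is a minimal sketch, assuming yhat is a NumPy array (the function name is just a placeholder):

    import numpy as np

    # Report how spread out the predictions are. If the std shrinks toward
    # zero over the epochs while the mean drifts, the outputs are collapsing
    # to a single value instead of varying per observation.
    def output_spread(yhat):
        yhat = np.asarray(yhat)
        return float(yhat.mean()), float(yhat.std())

    # Hypothetical use inside the training loop:
    # mean, std = output_spread(yhat)
    # print(f"epoch {epoch}: mean={mean:.4f}, std={std:.6f}")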

The output that I have is:
epoch 0, loss1 001.12 , loss2 034.93 , loss3 035.45, loss5 009.52 , totloss 081.01, acc 000.00
[1.1405597 1.107288 1.1093471 1.1373975 1.1396397 1.1277186 1.130907
1.1311299 1.1377169 1.1102827 1.1286004 1.1275425 1.1157365 1.1355797
1.1156945 1.1289982 1.126752 1.1433362 1.1223617]
epoch 1, loss1 001.10 , loss2 037.71 , loss3 040.34, loss5 011.70 , totloss 090.85, acc 000.05
[1.1225772 1.161662 1.1216304 1.1504776 1.1397775 1.1806581 1.1605873
1.143011 1.1311299 1.13676 1.1585436 1.1280268 1.1688657 1.1579617
1.1458558 1.1458558 1.1302841 1.1462775 1.1340436]
epoch 2, loss1 001.10 , loss2 036.14 , loss3 036.35, loss5 009.22 , totloss 082.82, acc 000.00
[1.1699617 1.1569937 1.1845496 1.1844468 1.152382 1.2059082 1.1704608
1.1789266 1.2139411 1.1711605 1.1610755 1.153051 1.1812706 1.1575742
1.1693635 1.1706606 1.18301 1.1964973 1.1716609]
epoch 3, loss1 001.10 , loss2 036.64 , loss3 036.14, loss5 008.84 , totloss 082.72, acc 000.05
[1.2007099 1.2341156 1.2004987 1.2547344 1.2282869 1.2278485 1.2440958
1.2400917 1.2177199 1.2093245 1.2228231 1.2310315 1.2310315 1.2358824
1.2203218 1.2440958 1.2018733 1.2330129 1.1950296]
epoch 4, loss1 001.09 , loss2 034.21 , loss3 037.43, loss5 010.75 , totloss 083.50, acc 000.32
[1.3193738 1.3362886 1.2875427 1.3245302 1.2740083 1.3582877 1.3386464
1.2861611 1.3156309 1.3644643 1.307931 1.2988677 1.3816202 1.3301677
1.3039752 1.360662 1.2940054 1.2914636 1.2712704]
epoch 5, loss1 001.10 , loss2 032.44 , loss3 034.00, loss5 008.73 , totloss 076.27, acc 000.42
[1.4496078 1.4650714 1.3861591 1.454679 1.4223737 1.4980042 1.4084361
1.4178045 1.3914201 1.4062761 1.3804265 1.4481595 1.4607198 1.4336894
1.3866371 1.4380276 1.4158816 1.4435745 1.3863981]

...

epoch 96, loss1 001.08 , loss2 019.22 , loss3 019.23, loss5 000.64 , totloss 040.18, acc 001.00
[4.0625 4.0585938 4.0703125 4.0585938 4.029297 4.0625 4.0664062
4.0664062 4.0566406 4.0703125 4.0664062 4.0507812 4.0546875 4.0585938
4.0664062 4.0664062 4.0585938 4.0664062 4.0566406]
epoch 97, loss1 001.09 , loss2 018.49 , loss3 018.53, loss5 000.70 , totloss 038.81, acc 001.00
[3.9902344 3.9726562 3.9902344 3.9902344 3.9882812 3.9960938 3.9921875
4.001953 3.9824219 3.9960938 3.9960938 4. 4. 3.9882812
3.9882812 3.9960938 3.9921875 3.9921875 3.9707031]
epoch 98, loss1 001.09 , loss2 017.47 , loss3 019.78, loss5 004.69 , totloss 043.03, acc 000.95
[4.0527344 4.0410156 4.0585938 4.0664062 4.048828 4.046875 4.0585938
4.0410156 4.0546875 4.060547 4.0546875 4.0625 4.048828 4.0625
4.0585938 4.064453 4.046875 4.0390625 4.060547 ]
epoch 99, loss1 001.11 , loss2 022.12 , loss3 019.87, loss5 004.69 , totloss 047.78, acc 000.95
[4.0234375 4.029297 4.0234375 4.0273438 4.0253906 4.029297 4.0273438
4.017578 4.0273438 4.0214844 4.0214844 4.0234375 4.03125 4.0273438
4.0273438 4.0253906 4.0234375 4.0195312 4.015625 ]
epoch 100, loss1 001.09 , loss2 019.24 , loss3 019.26, loss5 000.69 , totloss 040.28, acc 001.00
[3.9882812 3.9804688 3.9902344 3.9941406 3.9921875 3.9824219 3.9726562
3.9863281 3.9707031 3.984375 3.9785156 3.9765625 3.9785156 3.9824219
3.9746094 3.96875 3.9863281 3.9824219 3.9824219]

...

epoch 125, loss1 001.08 , loss2 019.14 , loss3 019.14, loss5 000.72 , totloss 040.07, acc 001.00
[3.9492188 3.9492188 3.9511719 3.9492188 3.9492188 3.9511719 3.9511719
3.9511719 3.9472656 3.9492188 3.9492188 3.9492188 3.9511719 3.9492188
3.9472656 3.9472656 3.9472656 3.9492188 3.9492188]
epoch 126, loss1 001.08 , loss2 018.86 , loss3 018.87, loss5 000.72 , totloss 039.53, acc 001.00
[3.9472656 3.9453125 3.9453125 3.9433594 3.9453125 3.9453125 3.9453125
3.9472656 3.9453125 3.9433594 3.9433594 3.9433594 3.9453125 3.9414062
3.9414062 3.9433594 3.9433594 3.9472656 3.9472656]
epoch 127, loss1 001.08 , loss2 019.23 , loss3 019.23, loss5 000.80 , totloss 040.33, acc 001.00
[3.8457031 3.8457031 3.8457031 3.8457031 3.8457031 3.8476562 3.8457031
3.8457031 3.84375 3.8457031 3.8457031 3.8476562 3.8457031 3.8457031
3.8457031 3.8457031 3.8457031 3.8457031 3.84375 ]
epoch 128, loss1 001.09 , loss2 019.86 , loss3 022.09, loss5 004.63 , totloss 047.67, acc 000.95
[3.8085938 3.8085938 3.8085938 3.8085938 3.8085938 3.8085938 3.8066406
3.8085938 3.8085938 3.8085938 3.8105469 3.8066406 3.8066406 3.8085938
3.8085938 3.8066406 3.8085938 3.8105469 3.8085938]
epoch 129, loss1 001.08 , loss2 019.39 , loss3 019.39, loss5 000.77 , totloss 040.63, acc 001.00
[3.875 3.875 3.875 3.875 3.875 3.875 3.875
3.875 3.875 3.8730469 3.875 3.875 3.875 3.875
3.875 3.8769531 3.875 3.875 3.8730469]
epoch 130, loss1 001.08 , loss2 018.86 , loss3 018.86, loss5 000.73 , totloss 039.53, acc 001.00
[3.9335938 3.9316406 3.9316406 3.9316406 3.9316406 3.9335938 3.9316406
3.9316406 3.9316406 3.9316406 3.9335938 3.9316406 3.9316406 3.9316406
3.9335938 3.9316406 3.9316406 3.9316406 3.9335938]
epoch 131, loss1 001.08 , loss2 018.85 , loss3 018.85, loss5 000.73 , totloss 039.51, acc 001.00
[3.9335938 3.9335938 3.9335938 3.9335938 3.9335938 3.9335938 3.9335938
3.9335938 3.9335938 3.9335938 3.9335938 3.9335938 3.9335938 3.9335938
3.9335938 3.9316406 3.9335938 3.9335938 3.9335938]

Does anyone have any idea about possible reasons?
The value that everything collapses to is close to the true value, though.