Inference returns the same tensor with really high values. Why?

After training, when I test my model I get the same output tensor over and over, regardless of the input. Why?
Is it also normal that the values are so large (on the order of 1e+15 to 1e+16)?
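For reference, this is roughly how I understand a test-time forward pass should look (the model and input names below are placeholders, not my actual code); I want to rule out the usual suspects like a missing `model.eval()` or `torch.no_grad()`:

```python
import torch
import torch.nn as nn

# Placeholder network standing in for the trained model;
# 188 outputs to match the size of the tensor shown below.
model = nn.Sequential(nn.Linear(10, 188), nn.Dropout(0.5))

model.eval()                      # put dropout/batchnorm into inference behavior
with torch.no_grad():             # disable autograd bookkeeping during testing
    x = torch.randn(1, 10)
    out_a = model(x)
    out_b = model(x)

# In eval mode dropout is deterministic, so identical inputs
# must give identical outputs.
print(torch.equal(out_a, out_b))          # True

# With default initialization, output magnitudes stay modest,
# nothing remotely near 1e+16.
print(out_a.abs().max().item() < 1e3)     # True
```

If outputs this small still come back identical for *different* inputs, or blow up to 1e+16 like mine do, does that point to the training itself (e.g. exploding weights) rather than the test loop?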

tensor([[ 1.2056e+16,  5.9767e+15,  2.1085e+15,  5.2158e+16,  7.4808e+15,
      1.6452e+15,  1.6628e+16,  1.8721e+16,  9.9798e+14,  2.4065e+15,
      1.0812e+16,  1.3105e+16,  7.6940e+14,  9.8767e+14,  7.3642e+15,
      4.2856e+15,  1.0966e+16,  1.7450e+16,  2.2951e+15,  1.4780e+16,
     -7.3730e+14,  3.9957e+15,  7.9150e+15,  8.4011e+15,  2.0821e+15,
      1.1922e+15,  1.4118e+16,  5.7647e+15,  7.5982e+15,  5.0951e+15,
      7.6691e+15,  8.4151e+15,  1.2151e+16,  1.0469e+16,  1.8380e+15,
      7.6098e+15, -3.2812e+15,  1.9539e+15,  6.2806e+15, -5.4205e+15,
      5.5007e+15, -3.2318e+15,  7.6758e+15, -7.1428e+14,  4.4580e+15,
      2.5704e+15,  3.0881e+15, -2.6194e+15,  8.4862e+15, -1.2936e+15,
      3.7698e+15,  1.9810e+15,  3.3016e+15,  1.6953e+15,  1.7504e+15,
      2.4143e+15,  2.5783e+15,  8.6065e+15,  8.2097e+14,  2.9864e+15,
      1.7083e+15, -1.1889e+15, -4.0307e+15, -3.3194e+15, -4.6815e+15,
     -3.3368e+15, -3.9685e+15, -1.7121e+15,  7.3881e+14,  2.4879e+15,
      3.2985e+15,  1.0613e+15, -1.7420e+15,  6.6461e+15, -9.2257e+14,
     -4.3156e+15,  7.8808e+15,  5.7687e+15,  4.0117e+15,  5.5298e+13,
      9.6740e+15,  5.1829e+15,  4.5807e+15,  2.5647e+15, -2.6201e+15,
      4.1714e+15, -9.2970e+13,  2.3725e+15,  4.1076e+15,  5.0096e+13,
     -3.7963e+14, -1.0503e+15,  8.4002e+14,  2.0620e+15,  9.3061e+14,
     -1.8499e+15, -2.4000e+15,  2.9994e+14, -5.4544e+15, -1.5808e+15,
      1.2891e+15,  2.4871e+15,  7.0153e+14,  1.9218e+15, -5.3665e+15,
     -4.9205e+15, -4.5869e+15, -6.6558e+15, -6.4367e+15, -6.5210e+15,
     -5.2257e+15,  2.0981e+15, -3.0363e+15, -6.1868e+14, -5.8302e+15,
     -2.0618e+15, -2.4618e+15,  2.6335e+15,  1.5778e+15, -6.7361e+15,
      2.0971e+15,  6.1178e+14, -1.8537e+15,  4.6063e+14,  3.4269e+15,
      3.4286e+15,  1.2131e+15, -2.4635e+15, -1.9550e+15, -3.6220e+15,
     -3.0985e+15, -8.2022e+14, -2.6846e+15, -2.6584e+15,  1.6548e+15,
      3.9198e+15,  2.0738e+15,  1.4412e+15,  3.0269e+15, -2.3843e+15,
     -9.4392e+14, -3.8304e+15, -1.0557e+15, -2.4536e+15, -4.0803e+15,
     -2.7027e+15, -1.0560e+15, -2.0054e+15, -3.5940e+15, -3.6986e+14,
      5.0988e+14, -3.1614e+15, -1.4806e+15, -1.6534e+15, -1.6811e+15,
     -4.0643e+15, -2.4029e+15, -3.7630e+15, -2.1271e+15, -2.5981e+15,
     -2.5673e+15, -2.1057e+15, -1.3371e+15,  7.5890e+14,  4.1515e+15,
     -2.7834e+15, -2.4481e+15,  9.5509e+14, -2.6114e+15,  5.6024e+14,
     -1.8457e+15, -2.0567e+15, -2.4415e+15, -7.2374e+14, -7.0101e+14,
     -3.8846e+15, -3.2767e+14,  8.6820e+14,  4.5376e+14, -2.8405e+15,
     -5.7237e+15, -2.9263e+15, -6.2191e+15, -5.3570e+15, -7.5934e+15,
     -5.7062e+15, -6.6455e+15, -5.6229e+15, -1.4619e+15, -3.0077e+15,
     -1.0732e+15, -3.8536e+15, -2.5924e+15, -3.5577e+15, -4.2650e+15,
     -2.9791e+15, -5.9215e+15, -4.2179e+15]], device='cuda:0')