Binary Classification Machine Learning Low Accuracy On Certain Datasets

I’m currently working on a program that reads a txt file with 300,000 lines, each containing 12 decimal numbers followed by a single 0 or 1 label, so that it can hopefully predict a different set’s 0/1 value from the 12 decimal numbers in front of it. As a first step, I’d like the program to hold out a subsection of the train set as a validation set so I can get a rough idea of the accuracy, which I’ve implemented below in PyTorch.

I’ve been following a set of very useful tutorials online: (https://www.youtube.com/watch?v=OGpQxIkR4ao&ab_channel=PythonEngineer) and (https://www.youtube.com/watch?v=PXOzkkB5eH0&ab_channel=PythonEngineer).

While using the sample online dataset this seemed to work perfectly, giving an accuracy of 0.9+, but when I use my local dataset the accuracy stays at roughly 0.5, which isn’t much better than random guessing (I go into further detail on this below the main code and output). Changing properties like the number of epochs and the learning rate does have a minor effect on the accuracy, so I’m assuming it’s not just predicting all 0s or all 1s, but the accuracy won’t change by more than about 4% even with a large number of epochs and a very low learning rate.
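One quick way to test that assumption is to count how often each class is actually predicted on the validation set. This is just a diagnostic sketch, with a small hypothetical array standing in for `model(x_test).round()`:

```python
import numpy as np

# Hypothetical rounded sigmoid outputs standing in for model(x_test).round()
y_pred_cls = np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 1], dtype=np.float32)

# Count how many of each class the model predicts: if one class dominates
# almost completely, the model has collapsed to a constant prediction.
classes, counts = np.unique(y_pred_cls, return_counts=True)
print(dict(zip(classes.tolist(), counts.tolist())))
```

With real predictions, a roughly even split here would support the idea that the model is doing more than predicting a single class.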

So if anyone has any advice on what I could be doing wrong with this program, it’d be greatly appreciated. I think it’s a small mistake that I just can’t see, since a similar version of the code was used on a separate online dataset (with a different data structure) and did really well. The code, output, and a sample set of inputs are attached below:

Python Code:

#Setup
import torch
import torch.nn as nn
import numpy as np

from sklearn import datasets
from torch.utils.data import Dataset, DataLoader
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

class BoneDataset(Dataset):
    def __init__(self):
        # Each row: 12 float features followed by a single 0/1 label.
        xy = np.loadtxt('train-io.txt', delimiter=" ", dtype=np.float32, skiprows=0)
        self.x = torch.from_numpy(xy[:, 0:12])
        self.y = torch.from_numpy(xy[:, 12]).int()
        self.n_samples, self.n_features = self.x.shape
        
    def __getitem__(self, index):
        return self.x[index], self.y[index]
    
    def __len__(self):
        return self.n_samples

dataset = BoneDataset()

xt = dataset.x
yt = dataset.y
x = np.array(xt)
y = np.array(yt)
n_samples = dataset.n_samples
n_features = dataset.n_features

#bc = datasets.load_breast_cancer()
#x = bc.data
#y = bc.target
#n_samples, n_features = x.shape

print(x)
print(y)
print(x.shape)
print(y.shape)
print(type(x))
print(type(y))
print(n_samples, n_features)

x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=1234)

scale = StandardScaler()
x_train = scale.fit_transform(x_train)
x_test = scale.transform(x_test)

x_train = torch.from_numpy(x_train.astype(np.float32))
x_test = torch.from_numpy(x_test.astype(np.float32))
y_train = torch.from_numpy(y_train.astype(np.float32))
y_test = torch.from_numpy(y_test.astype(np.float32))

y_train = y_train.view(y_train.shape[0], 1)
y_test = y_test.view(y_test.shape[0], 1)

#Model
class LogisticRegression(nn.Module):
    def __init__(self, n_input_features):
        super(LogisticRegression, self).__init__()
        self.linear = nn.Linear(n_input_features, 1)
    
    def forward(self, x):
        y_predicted = torch.sigmoid(self.linear(x))
        return y_predicted
    
model = LogisticRegression(n_features)

#Loss And Optimizer
num_epochs = 100
learning_rate = 0.01
criterion = nn.BCELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)

#Training Loop
for epoch in range(num_epochs):
    y_predicted = model(x_train)
    loss = criterion(y_predicted, y_train)
    
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    
    if (epoch + 1) % 10 == 0:
        print(f'epoch: {epoch+1}, loss = {loss.item():.4f}')

with torch.no_grad():
    y_pred = model(x_test)
    y_pred_cls= y_pred.round()
    acc = y_pred_cls.eq(y_test).sum() / float(y_test.shape[0])
    print(f'accuracy: {acc.item():.4f}')

Output:

epoch: 10, loss = 0.7625
epoch: 20, loss = 0.7567
epoch: 30, loss = 0.7514
epoch: 40, loss = 0.7464
epoch: 50, loss = 0.7419
epoch: 60, loss = 0.7377
epoch: 70, loss = 0.7339
epoch: 80, loss = 0.7304
epoch: 90, loss = 0.7272
epoch: 100, loss = 0.7242
accuracy: 0.5127
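For context, an accuracy like the 0.5127 above can be compared against the majority-class baseline of the labels; if the two are close, the model has learned very little beyond the class frequencies. A minimal sketch, with a small hypothetical label vector standing in for the 13th column of train-io.txt:

```python
import numpy as np

# Hypothetical label vector standing in for dataset.y; with the real data
# this would be the final column of train-io.txt.
y = np.array([0, 1, 1, 1, 0, 1, 0, 1, 1, 0])

# Accuracy of always predicting the most common class.
baseline = max(np.mean(y == 0), np.mean(y == 1))
print(baseline)
```

If the real labels are close to a 50/50 split, then 0.51 accuracy is essentially at the baseline.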

If I use the online sample dataset I get an accuracy of 0.9+, which is really good. The only change needed to use the online sample set is to uncomment the following lines of code:

#bc = datasets.load_breast_cancer()
#x = bc.data
#y = bc.target
#n_samples, n_features = x.shape

It is really strange that I’m getting such a low accuracy with the local dataset, since, as the print statements below show, the online sample set and the local dataset look nearly identical except in size, so I don’t understand why one gets an accuracy of 0.9+ while the other gets only about 0.5.

print(x)
print(y)
print(x.shape)
print(y.shape)
print(type(x))
print(type(y))

Using The Locally Available Dataset

[[ 1.8304332   1.4942976  -2.1410794  ...  1.3968126  -1.5442173
   1.6766251 ]
 [-1.4084567  -2.5564973   2.6160343  ... -0.58872694  4.597042
  -0.1104113 ]
 [ 1.7604579   4.608794   -0.14257275 ...  2.1461182   3.0524266
   0.9494277 ]
 ...
 [ 0.26332846 -0.31079966  1.830318   ... -2.2811062  -2.1526384
  -1.9504634 ]
 [-3.0524952  -0.38578448  5.5219817  ... -0.91297805  1.2235076
  -0.7969331 ]
 [ 0.9841667   0.6342935   0.03264663 ...  1.0182375  -0.11988503
   1.1116314 ]]
[0 1 1 ... 1 0 1]
(300000, 12)
(300000,)
<class 'numpy.ndarray'>
<class 'numpy.ndarray'>

Using the online sample dataset (uncommented)

[[1.799e+01 1.038e+01 1.228e+02 ... 2.654e-01 4.601e-01 1.189e-01]
 [2.057e+01 1.777e+01 1.329e+02 ... 1.860e-01 2.750e-01 8.902e-02]
 [1.969e+01 2.125e+01 1.300e+02 ... 2.430e-01 3.613e-01 8.758e-02]
 ...
 [1.660e+01 2.808e+01 1.083e+02 ... 1.418e-01 2.218e-01 7.820e-02]
 [2.060e+01 2.933e+01 1.401e+02 ... 2.650e-01 4.087e-01 1.240e-01]
 [7.760e+00 2.454e+01 4.792e+01 ... 0.000e+00 2.871e-01 7.039e-02]]
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
 1 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 0 0 1 0 0 1 1 1 1 0 1 0 0 1 1 1 1 0 1 0 0
 1 0 1 0 0 1 1 1 0 0 1 0 0 0 1 1 1 0 1 1 0 0 1 1 1 0 0 1 1 1 1 0 1 1 0 1 1
 1 1 1 1 1 1 0 0 0 1 0 0 1 1 1 0 0 1 0 1 0 0 1 0 0 1 1 0 1 1 0 1 1 1 1 0 1
 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 1 0 1 1 0 0 1 1 0 0 1 1 1 1 0 1 1 0 0 0 1 0
 1 0 1 1 1 0 1 1 0 0 1 0 0 0 0 1 0 0 0 1 0 1 0 1 1 0 1 0 0 0 0 1 1 0 0 1 1
 1 0 1 1 1 1 1 0 0 1 1 0 1 1 0 0 1 0 1 1 1 1 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 1 1 1 1 1 1 0 1 0 1 1 0 1 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1
 1 0 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 0 1 1 1 1 0 0 0 1 1
 1 1 0 1 0 1 0 1 1 1 0 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 0 0 1 0 0
 0 1 0 0 1 1 1 1 1 0 1 1 1 1 1 0 1 1 1 0 1 1 0 0 1 1 1 1 1 1 0 1 1 1 1 1 1
 1 0 1 1 1 1 1 0 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 1 0 1 1 1 1 1 0 1 1
 0 1 0 1 1 0 1 0 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 0 1
 1 1 1 1 1 1 0 1 0 1 1 0 1 1 1 1 1 0 0 1 0 1 0 1 1 1 1 1 0 1 1 0 1 0 1 0 0
 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 0 0 0 0 0 0 1]
(569, 30)
(569,)
<class 'numpy.ndarray'>
<class 'numpy.ndarray'>
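Beyond the shapes and dtypes above, one diagnostic that might distinguish the two datasets is the linear correlation of each feature column with the label, since a single nn.Linear layer followed by a sigmoid can only exploit linear structure. A sketch on hypothetical random stand-in data (substitute the real x and y in practice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the local dataset's x (n_samples x 12) and y.
x = rng.standard_normal((1000, 12)).astype(np.float32)
y = rng.integers(0, 2, size=1000)

# Pearson correlation of each feature with the label; values near zero for
# every column would suggest there is little *linear* signal for logistic
# regression to pick up, even if a nonlinear model could do better.
corr = np.array([np.corrcoef(x[:, j], y)[0, 1] for j in range(x.shape[1])])
print(np.round(corr, 3))
```

On random stand-in data every correlation is near zero by construction; on the breast cancer set several features correlate strongly with the target, which could explain the accuracy gap if the local data lacks such linear structure.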

Sample Set Of Inputs From The Intended Local Dataset (150 of 300,000 lines) (needs to be saved as ‘train-io.txt’ in the same directory as the Python file to be used)

-1.99045217 -1.40607965 2.71678582 -3.70629924 -3.14337061e-01 -1.54770974e-01 3.50844296 -1.08004104 -2.43384485e-01 6.89013782e-01 -1.67605624 5.01576543e-02 1
-3.01029538e-01 -4.72911318 6.01167146e-01 1.62611969e-01 3.62557887 -1.45076077 6.01137263e-01 -2.24355449 1.63722894 4.25577822e-02 1.13424037 4.34732584 0
1.09885188 -2.39236136 -3.28692724 1.77307004 1.40922587 -1.39747528e-01 9.35373268e-01 1.54748173 -6.57541866e-01 1.72365529e-01 1.41654990 2.23833608 1
1.17889801 -3.32005518 4.90683895e-01 2.56304614 2.67164645e-01 -3.80436527e-01 -3.02129248 -3.44614518 -2.58423006e-01 -1.73783199 7.27725203 6.92246822e-01 0
8.22938016e-01 3.94808122 -4.21515016 3.33085884e-01 -8.06812221e-01 -1.08405406e-02 -2.51833132 9.71709856e-01 -5.58478894e-01 3.95492213 -8.31784276e-01 4.04523014 1
1.75743713 2.84824302 -1.27702129e-01 -2.50509786 -3.88282246 8.14555801e-01 -1.79830736 1.55416768 2.12638577 4.82215517e-01 -1.18923070 9.02767768e-03 1
-3.06138028 -1.54101035e-01 3.11603730 9.43639689e-01 -1.32776018 3.42432862 1.43936324 -2.12182357 -1.01854679 8.36730188e-01 -5.06089908e-01 -2.07921534 0
2.30487790e-01 7.45982442e-01 -6.78751843e-01 -3.23793317e-01 -3.84737068e-01 9.89033966e-01 -1.49970017 1.63012449 -3.65336996e-01 -2.55591687 1.14968751 -7.48640101e-01 0
-4.82435234e-01 -3.68697214 -2.76750274 -3.84704447e-01 1.96097197 -4.02393873e-01 -1.48450479 -5.34861155e-01 -3.97099775 2.22917894 1.07527652e-01 4.34356890 1
1.09260716 -1.97936942 -3.53048534 9.98976929e-01 3.82695544 1.00563733 1.20487463 6.40798068e-02 3.80468181 3.46850632e-01 -6.66443772e-01 4.10101038 1
-2.77987906e-01 4.30866140e-01 3.40009598 1.50542872 -3.57716511 -2.76879409e-01 -8.25889431e-01 -4.29255353 1.00709224 4.93104869e-01 4.92572898 -2.13072667 1
8.74514415e-01 1.01267371 -2.37627404 -5.38215825e-01 -5.57225987e-01 -4.36341929e-01 -1.81164583 2.11915130 -1.17345865 2.48300038e-02 1.25571411 7.39795873e-01 0
-2.00280376 6.15709367e-01 -1.06682049 -1.73539177 1.15651584 -1.23573041 1.69430038e-01 2.28671814 -3.93934986 1.03060456 -5.69093574 -2.57840979 0
2.60617625e-01 3.15456270 2.92078929e-01 2.81195292 -2.61769193 1.78990718 -1.04412304 -5.73573922e-01 -1.05850956 1.20736263 2.83823697 -1.51473802 1
2.60789031e-01 9.74821202e-01 -2.04432954 9.98022533e-01 -2.22635565 3.07252556 -1.06536161 2.43968291 -1.92167364 1.91222555 -3.27332253 -6.17255207e-01 0
-2.46173532 -2.01571712 1.94405602 -3.14753385e-01 3.54422107 -1.01141623 7.92302803e-01 -1.97893915e-01 1.03894716e-01 -1.93666192 -1.34038508 -1.65052710 1
-1.95861778 -1.47651248 1.84973259 -2.38589385 2.63538478e-01 9.52324226e-01 -3.15503670e-01 -8.59884510e-01 -2.28195158 -7.17481185e-01 -2.22101331e-01 -9.83930977e-01 0
-1.01248647 1.79986720 -7.71369179e-01 -1.12281733e-01 -1.48920994 1.05928022 -1.20796809 -5.28110296e-01 -4.05617508 1.91730940 7.42554578e-01 -1.37507818 1
2.01225466 -1.38009260 6.95320018e-01 1.97844237 3.08713581 -3.62956436 -8.72124080e-01 -3.32340283e-01 1.98508691 -3.65525512 1.56040796 -2.23652758 0
3.00102525 -3.43334802 -6.17977589e-01 1.33968151 6.11488821e-01 4.46359974e-01 -1.62181215 7.69268199e-01 2.92582015 -8.20197601e-01 1.16980342 3.61828347 1
-1.24974354 -1.60724288 2.16492041 -2.39537128 3.56895868 -3.61436089 1.99851472 -2.84486815 3.64098411 -1.38603068e-01 9.49372521e-02 2.00912416 0
6.60007069e-01 -2.34196630 8.43437251e-01 1.06902166 2.15084932e-02 -1.62142382 5.94795228e-01 -1.85403338 1.32944650 -7.85104052e-01 5.98350396 1.11672724 1
-8.13673247e-01 -3.35189434 2.30034603 2.55960419 7.19306203e-01 -7.16594033e-01 6.74980779e-01 -1.56094406 -3.94020523e-01 -2.19806019e-01 2.91600364 -1.38456581e-01 1
2.88625846e-01 1.95024766 -2.80730507 -5.04139795e-01 -1.30620202 3.34701828e-01 1.21924626 6.54270953e-01 -1.31668027 4.15068718 1.11628232 4.60273037 1
1.42240617 -1.52786409 1.94582838 4.15441726e-01 -2.45614436e-01 -2.49177143 -2.24779719 -2.10736413 4.62016747e-01 3.77047389e-01 2.70047198 1.10223965e-01 1
4.97250924e-02 1.84492469 -2.09121308e-01 2.79241059e-01 -1.27565380 8.76631371e-01 1.38104035e-01 1.93882566 1.25010393 1.77516266e-01 -1.18198474 4.28310892e-01 1
2.43787551e-01 -8.77814582e-01 8.44480348e-01 4.53061837 -9.79232303e-01 1.16220415 -1.01839850 -1.54194710 7.89990530e-02 -3.27166244 4.42766359 -4.82145758 1
-5.90232641e-01 -2.68044578 3.73950509e-01 1.65193898 2.47471044 -6.56377707e-01 1.63348829 -1.70227839 -3.74047732e-01 -4.76014487e-01 4.01968901 1.53975718e-01 1
-4.60462291e-01 -6.22415284e-01 1.19632802 2.34732443 -1.66215026 1.10052807 -6.27810799e-01 -2.06104099 -7.32079579e-01 -3.19731536e-02 1.99308086 -2.71129226 0
2.93026821e-01 -1.27577085 2.44644987 -3.54946646 -4.87462606e-02 -1.37029122 1.89517775e-01 2.51677138 1.05652047 -3.65718360 -3.12442272 -3.67560924 1
-1.28143493 -1.03075991 8.18856150e-02 -1.09452237 1.62149019e-01 -1.09857845e-01 3.11924251 -4.50234039e-01 -1.75974663 -1.67415043e-01 1.32144580e-01 -5.72724237e-01 0
-2.63031927e-01 7.61118748e-01 -6.87239876e-01 1.14774710 -2.08398106 2.26786864 4.08815365e-01 -1.32038313 2.23316330e-01 2.86616349 4.17287920 3.63608562 0
6.63113250e-01 -3.96018623 -5.73021296e-01 2.29293385 4.01986173e-02 2.43747203 -2.89111164 -1.24720297 -1.46618565 -3.36573126e-02 1.37912438 2.05394006 1
1.06798486e-01 -2.77414142 1.96625812 -4.30559691 1.86741799 1.18880358e-01 1.31952116 3.04984523 1.56535606 -3.39020590 -4.70678287 7.92359002e-02 0
1.87383472 -3.91166543 -1.09488410 -7.26581039e-01 -9.13651777e-03 -1.08976009 -1.42488400 3.26690407e-01 2.47642054e-01 -2.23164831 2.76553986 -4.60747305e-01 1
3.33523273e-01 -4.12810109e-01 -3.40122132 -3.94550550e-01 2.93208458 -1.87029798 2.72906647e-01 1.13199527 5.91035326e-01 2.24309786 -5.59897087e-01 5.66058945 1
1.72231444 3.71991324 -3.91546557 -2.26934462e-01 -8.29375218e-01 -9.42668171e-01 -1.61990027 8.13791255e-01 -2.31212947 2.94626031 -5.37643814e-01 5.15769167e-01 0
-1.64656362 -1.10954738 2.39813734 1.96249452 1.66441076e-01 8.66178206e-01 2.05049544 -1.23216013 -1.43094971 -7.44178763e-01 9.48777857e-01 -2.02534343 1
1.73855348 -5.53782190 -6.48851617e-01 -7.91664531e-01 2.13814094 -1.80545414e-01 1.78194252e-01 -6.43020613e-01 4.31186541 -3.66270956 2.82157804 2.71136265 1
-8.87566251e-01 -1.77773574 1.34194811 6.81862863e-01 2.23841597e-01 3.04617433e-01 5.77932560e-01 1.06971578 -7.69409560e-01 -2.91748424e-01 -2.70308382 -1.95605509 0
-4.20572091e-01 3.99217311 -7.29230953e-01 1.47574009 -1.22779197 -5.79486683e-01 1.28798133 -1.35672449 -5.91131009e-01 1.84120357 1.11045316 -2.58050900 0
-1.62840094e-02 1.74771115 -7.52295707e-01 2.62941500 -9.16626077e-01 -1.21722898 1.78116381 -1.58876586 -1.19170240 7.14179141e-02 3.66111231 -3.09918691 0
-1.81674620 1.12419300 1.80180619 1.21744150 -2.26994272 2.24806159 1.00406612 9.53202960e-02 -1.50774030 -1.13149569 2.28003777 -2.56460453 0
3.41775319e-01 3.13306537e-01 -1.85592820 -3.03912856 -4.62861452e-01 -1.68978602 -2.51214948 8.18191403e-01 -1.94236187 4.91682185e-01 -1.70550469 9.96096189e-01 0
9.97665809e-01 -2.55123055 -2.36339827 -2.43837174 3.67386376 -6.64247180e-01 1.96878439 1.41325975 2.10175154 -2.15540633e-01 -3.46505371 3.08260778 1
7.41206075e-01 5.25938253e-02 -1.63407505 -2.94868685 1.68701321 -3.18876113e-01 1.66284488 2.27586665 3.31022383 -1.89022837 -2.11058711 2.93154892e-01 1
-1.45125874 -2.87047212e-01 -3.95522232e-01 2.82596238 -6.62968333e-01 7.74603777e-01 -2.41366456 -8.40496347e-01 -4.24148741 2.30185734 1.87320437e-01 9.21871648e-01 1
4.80180980e-01 -3.22391839 3.02077702 2.11114219e-01 1.22601595e-03 2.07767955e-01 1.24429967 -4.35541181 2.60551345 8.89125744e-01 4.98055440 2.54253354 1
-1.00008123e-01 -1.58087204 4.69136798e-01 1.30944137 8.36538385e-01 -9.23272536e-02 2.59016974e-01 2.51590232e-01 -1.59529284 -2.10540651e-01 -1.54982679 -1.59202729 1
1.33005590 -8.90834865e-01 -6.07503224e-01 -6.88872087e-01 -2.86428109 5.56980638e-01 -4.07025088 2.44317241 -2.41680442 -3.27593333e-01 -1.47266564 -1.01572028 0
2.47064813 -1.98032429 1.94198681 -1.30914558 1.73696288 -1.70342589 1.61138266e-01 -6.82397038e-01 5.71809726 -4.05905752 2.69289294 5.90212774e-01 1
3.13886569e-01 -1.86690128 1.54766352 -6.48078985e-01 1.36924175 -5.55395241e-01 -1.74338288 1.51646874 7.32823137e-01 -1.81922576 -2.51211499 7.78449209e-01 1
-9.17662863e-01 -6.92375777e-01 1.63559783 4.19548782e-01 9.19659494e-01 -9.67884041e-02 6.85198449e-01 -3.67690610 1.88990301 7.51537502e-01 2.36336409 3.63748264e-01 0
-9.80011156e-01 3.38744850e-01 9.21666550e-01 2.10984707 -1.51896097 2.51128786 -1.30601822 -3.25038508e-02 3.50474264e-01 -1.28485647 9.77570988e-01 -1.20937358 1
1.34161949 -1.22712754 1.39019532e-01 -2.11440927e-01 -1.15163276 1.69007507 -1.82377540 1.21873133 3.68924383 -1.32552347 5.66901989e-01 2.16897704 0
4.15433837e-01 2.16441717 -1.60139711 2.65209875e-01 7.77421222e-01 -1.57758178 -5.59605702e-01 6.97650336e-01 9.18129264e-01 7.52868966e-02 9.36437016e-01 -7.26330546e-02 0
9.33163363e-01 1.12383909 -3.36796402e-02 1.17777024 1.11081539e-01 1.45826315e-01 3.89457181e-01 -6.17801382e-01 2.11921643 1.12971364e-01 2.00204848 2.50715539 0
-3.21169596 7.44209603e-01 2.10952598 -1.72021983 3.97903388e-01 1.92370695 3.99035016 -1.25250273 -1.36679020 -9.63531549e-01 -1.76338515 -3.37937986 1
1.04116864 -5.60658734 -8.14189974e-01 -1.12964269 -1.41118754 1.20654583 -4.16509926e-01 1.67311414 -1.65361401 -4.85691485e-01 4.92103140e-01 2.21492084 0
1.43873126 -2.73681159 1.20846791 -8.51985246e-02 4.88444044e-01 -1.11489906 -1.53916118 1.26008384 -1.00781372 -1.23831067 -5.82889416e-01 8.53040242e-02 0
-1.50139655 2.00557845 3.34495731 -2.51211565 -4.28963465e-01 -2.19134429 -9.24418716e-01 6.52437789e-01 4.96052082e-01 -1.17352793 -1.92322159 -1.35631068 1
-1.08151973 -1.85230845 2.86778176e-01 -1.69395789 8.29921609e-02 1.65228616 2.27340126 1.14837187 1.27143425e-01 -1.14173433 -2.12017607e-01 8.94119328e-01 1
2.97954854e-01 -1.27670821 3.78387695 2.68967459 -1.40000730 -2.94450251e-02 -1.15271809 -1.96678259 8.72427410e-01 -1.49336937 4.59175007 -1.08759349 0
-5.56427238e-01 2.54783776e-01 -4.16050945e-01 2.06033636 1.85733010 -2.09502247 1.28829369 -9.83868101e-01 -3.74586242e-01 1.57401812 2.08720260 1.62141098 1
2.67427951e-02 7.37631258e-01 -1.44256044 9.33160575e-01 -1.16273112 -2.16269098e-01 -6.02732964e-01 -2.30960268 -1.19407761 2.03530608 9.25870000e-01 -9.49762202e-01 1
-4.34754024e-01 4.48937145 2.02333416 -4.76022038e-01 -1.87536605 1.25087644 -8.98808233e-01 -9.95346181e-02 3.80850111e-01 -4.63671489e-01 -3.01830297 -3.21889436 0
5.93653569e-01 2.40701102e-01 9.36238621e-01 1.50946564e-01 -2.13759786 1.49295816 -3.15730312e-01 1.65581112 1.85506760 -1.64725296 8.16107673e-01 8.30632448e-02 0
1.41195790 5.11646841e-01 3.66143521e-01 3.14544456 -3.12919138 2.72340091 1.58931544 -1.84128260 -2.56557535e-01 1.47925825 3.17911927 -1.49156369 1
1.15142040e-01 4.13453021e-01 -1.97274952 -1.63456320e-01 -1.54612003 9.92433623e-01 1.82566930 2.05431064 -2.65458583 1.48648930 -3.44822763e-01 9.62405732e-01 1
3.37484879 1.41096014 -9.47525008e-01 1.42576570 1.13266800 -2.33114818 -3.14614001 3.01109722e-01 2.90114554 -3.60848460e-01 2.36508207 3.89261880 0
-3.89965626 4.22934202 -3.89861361e-01 -1.99037480 1.44990976e-01 3.89374446e-01 -1.24167715 3.19016428e-01 -4.33073097 3.01482404 -7.37221425 -2.30335361 0
7.94350039e-01 3.43422608e-01 1.00909708 -1.57234724 2.46498143 -2.93643277 1.42869950 2.45891716e-01 2.26498941 -2.82827935 2.13195742 4.51520547e-02 0
-1.46350452 -1.68244236e-01 -7.25391097e-01 2.23358537 -2.16220990e-01 9.16727106e-01 6.41077643e-01 -1.63429135 -3.39445546 1.48021692 6.02291799e-01 -1.07421229 1
-2.03126725 9.61440712e-01 -1.91739883 1.88286079 2.66097819 2.92949524e-01 1.76122521e-01 1.29265969 -2.54021225 -6.22967517e-02 -3.48805754 -1.93215934 1
4.81278869e-01 -4.26681533 1.37135121 4.32201565 -5.70069576e-01 2.22770166 -1.86511487 -3.02177507 -1.35530228 -9.84459858e-01 5.38026365 -8.66767241e-01 1
-1.41849085 1.12353807 4.51147377 -3.56217480 -2.10557627 1.25361300 2.60120393 8.71615529e-01 5.24687766e-01 -6.75519699e-01 -3.23506795 -1.93004516 1
-5.84452725e-01 -2.18306493 1.37567527 6.05895462e-01 1.09971454 -9.46591694e-01 -8.67026496e-01 -2.68518059 -3.23195520e-01 -9.00955302e-01 2.31535905 -9.05065818e-02 1
1.51094890 -5.90747724e-01 -8.14252725e-01 1.64024944e-01 -1.45762865 8.60053386e-02 -1.65631118 1.67011639 -1.48299729 -6.00542541e-01 -9.00395798e-01 -4.27510051e-01 0
4.78498395e-01 -8.82750092e-01 8.06713505e-01 1.55528843 2.29411673 -2.46964201 -8.43021071e-01 -3.57437514 8.26258614e-01 -7.70455408e-01 4.42997898 1.53379574 1
4.17703501e-01 3.74599626 7.00744082e-01 -4.58008987e-03 -7.45831706e-01 8.99324071e-01 1.05570049 1.56111782 7.91853922e-01 -1.10687897 -1.93980664 -3.81999754 1
1.60357816 -2.63911840 -1.21608631 6.33754555e-01 3.43375848e-01 -3.19820223e-02 1.72166232 1.20792687 6.74585652e-01 9.06037567e-01 1.19211647 3.14972852 0
-5.44356657e-01 4.77044698 9.91897242e-01 -7.32455140e-02 -4.00896458 1.40831371 -2.42512600 -1.55948113 -1.86304995 3.15750920 -1.67025768 -1.74118388 1
1.18432796 -1.43940684 -3.11369665 1.95345173 -1.62649411e-01 1.49965523 -1.01630373 1.20819020 -1.15734959 2.01246811 2.73009363 3.90120953 0
8.70000616e-01 -2.12632378e-01 1.57336982e-01 -1.92877438 1.88142525 -1.97379033 8.09770289e-01 -1.14245941 3.30719083 2.50672385 -7.78794967e-01 5.78595966 1
-1.39598825 -6.15108078e-01 5.88924559e-01 1.99183656 -1.23520238 3.25170339e-01 -1.72577518 -2.07839834 -3.12877250 -6.29892180e-01 2.49826591 -3.75260875 1
2.37303931 -6.19716628e-01 -9.66419164e-01 3.31410920 -3.18309060 4.14848835 -3.65223356e-01 -2.80660958e-01 8.79105865e-01 -7.91441395e-01 5.28566901 2.15464801e-01 1
7.40636972e-01 -3.47734722 -3.12292699 2.28847593 2.13836292 9.34066330e-01 8.08968878e-01 1.62897592 -3.77156845e-01 3.88708409e-01 -7.03585742e-01 2.16686220 1
-1.00032608 -9.70775457e-01 8.49845920e-01 2.59159932 1.21900765 2.42140455e-01 2.48804895 -2.79519961 -4.39887556e-01 -1.84211046e-01 2.99459212 -2.08396613 1
-2.40565885 2.08007991 8.06139167e-01 -1.07166055 1.10462938 -1.36210210 -2.13905376e-02 -1.40433108 -6.47430218e-01 8.00126413e-01 -1.10226494 -1.85998223 1
1.22772350 6.29655107e-01 -4.55809669 2.29649529e-01 -8.99841737e-01 1.21718459 4.71484606e-01 6.45679507e-01 -2.01758033 1.83160875 7.96152191e-02 8.36298119e-01 1
-5.74542888e-01 -6.08573127e-01 -1.76801860 -1.67378331 1.36908667 3.23029954e-01 -1.43667466 -5.44866292e-01 -1.49599287 2.30920321 -2.26249676 2.89150805 0
-7.24894853e-01 -3.22223246 7.99269796e-01 -1.63662172 1.94510184 -1.09986575 4.58036960e-01 -1.15000023e-01 -1.57773616 1.48660437 -2.64844182 2.86166838 0
1.23982406 4.59089181e-01 7.76841316e-01 1.25171098 2.08979353 -1.33607871 -7.66265436e-01 -1.06325701e-01 2.24292937 -3.69163766e-01 -5.58538439e-01 1.25378620 0
-2.56481369 -1.53409616 3.54153672 2.31056529 -2.21308639 7.56854975e-01 -6.73976334e-01 -2.13994296 -3.57229521 6.88053823e-01 1.86878050 -2.54076420 0
3.48125209e-01 -3.26411204 -2.67893724 8.86448177e-01 2.97851635 7.25598187e-01 2.67760841 1.34255617 -1.34737565 -1.70538901e-01 -7.13595504e-01 1.57750197 0
2.41223384 7.53368970e-01 -3.15058264 5.58281954e-01 -1.48327515 2.37221669e-03 -2.75980458 2.80860540 -1.22907727 2.68183651e-01 -2.32228947 -1.65279853 0
2.94339296e-01 3.76408217 5.28620863e-01 -1.42830631e-02 -1.45162658 -1.76980783e-01 1.27612705 -1.32343210e-01 -7.51706081e-01 5.26546853e-02 7.23092212e-01 -3.85140049 1
-1.28664730 -1.97108906 -1.36586710e-01 3.35326326 9.84355120e-01 -5.34587417e-01 -2.33207643 -2.39976774 -2.27221571 -6.68165185e-01 1.47789838 -2.18344518 1
-1.87880798 2.08569222 3.56746803e-01 -9.23023261e-01 1.36791649 -1.18444335 2.49763306 -2.20536405 -1.36556515e-01 2.41822464 -2.12285168 -1.96048123 0
-1.64915803 -2.47264506 3.55676283 -4.76082493e-01 2.81232000e-01 7.55290464e-01 -7.19617760e-01 -1.30774534 -5.27444700e-01 -4.72019228e-01 8.04685572e-01 5.78090294e-01 1
2.39198106e-01 6.86738712e-01 1.42381679 -1.22774451 -1.61102555 6.25575060e-01 -1.52775092 -1.42485467 1.63141258 9.48887590e-01 6.06461571e-02 2.80981926 0
-1.05163638 1.95390753 1.92668076 -8.26504455e-01 -2.83147153 3.23399944 2.21343434 -6.07850778e-01 3.65722795e-01 2.97594522e-01 8.43209731e-02 -1.28260526 1
-3.14050483e-01 2.78104314 -7.83664459e-01 -1.72832438 1.16827748 -2.04136207 1.59667265 1.80356215 1.44639442e-01 4.88340768e-01 -2.35477521 -1.03756311e-01 0
1.54961048 1.20367609e-01 -4.18633131 1.88695302 7.72416820e-01 5.25091937e-01 -1.48793879 -1.04236692 1.33465922 1.00755242 4.82347600 4.08071022 0
1.95759218 -3.84860122 1.04048691 -1.53245714e-02 -7.12872878e-01 -2.55745657e-01 -5.03235710e-01 4.44001457e-01 1.75375391 -4.40520833 3.59428245 -1.48300290 0
-9.15585307e-01 3.90272564 -1.32066195 -1.47132915e-02 -1.76695278 7.75767513e-01 -1.77198010 5.08819490e-01 -7.09416100e-01 5.46901260e-01 -2.36069612 -3.23985468 1
1.27969854 -1.68851361 1.76246578 -5.51777281e-01 7.39978271e-01 5.70102715e-01 4.10949652e-01 -1.45450692 6.06376286 -1.51763757 3.45595413 4.17792055 0
-2.37507952e-03 3.27291308 -1.74457889 -3.48786906e-01 -7.30684017e-01 -1.93536195 1.38487081e-02 1.56107433 -3.37955854 2.75820164 -3.84182944 -1.97048840 0
-8.87890412e-01 1.98006543 -1.93529191 -1.03887921 -9.22692785e-02 -1.40348058 -1.48990800 2.17483476 -4.95248900 2.06232819 -3.05169373 -6.40623205e-01 0
-8.61954461e-01 1.84867822 -1.37718019 2.56505934 2.09021981e-01 9.14121965e-01 2.06445999 1.50515771 -2.85967204 -3.86863194e-01 -7.95136963e-01 -3.95386206 1
4.33868290e-01 5.78749489e-01 -4.87272956e-01 2.15833547 5.60177987e-01 2.83736364 2.42992551 -3.63015474e-01 2.51008173 1.67543627 -5.21621685e-01 2.23277819 0
2.61924952 -2.88332767 7.11302827e-01 -2.03238326 -2.27718116e-01 -1.91966813 1.16909366 -8.95343429e-01 2.68464271 -2.20531803 3.41888485 -9.83299436e-02 0
-3.83849742e-01 -3.69088517 8.49071826e-01 -1.75140219 2.36764524 -1.63597529 2.26076011 -6.19555933e-02 6.10005455e-01 -2.71925901 2.22624934 7.31406971e-01 0
1.32672279e-01 1.06158817 4.72800099e-01 -8.70747345e-01 1.52003425 -1.40246662 -1.73008119 -8.11942439e-01 1.92012365 7.83008379e-01 -1.45059603 2.70034753 1
-2.26805833 4.51436997e-01 -1.44413196 -2.81134361 2.25246957 -1.23801758 1.73425756 4.93569420e-02 4.18315397e-01 2.18708543 -2.71121909 4.07006826 0
1.67918821 -2.14327152 -6.77399467e-01 9.38956054e-01 -1.56750862 1.76170320 2.20348137 1.38294603 1.69332653 -1.89536324 4.79633531 1.28656419 0
-9.77619422e-01 -5.88634592e-01 2.91191341e-01 2.09438834 -1.13105574 1.52685247 -3.08579997 -2.26792123 -7.33922011e-01 1.49417664 2.19799001 2.40726741 0
1.06630829 1.18715043e-01 -1.26370675 -1.79883160e-01 -1.01493947e-01 6.97678944e-01 -1.11846369 1.06722436 8.49373309e-01 7.11225054e-01 8.91624732e-01 2.66697401 1
1.29467923 -2.31250843 -1.01682113e-01 2.05739960 -1.64065797 3.08104861 -1.60108425 1.80031874e-01 5.80384486e-01 -2.78387800e-02 2.63656170 1.12710995 1
-8.76223689e-01 3.57608171 3.51534533 5.49601953e-03 -1.94715603 -1.67225258e-01 -2.07739385 -2.74262177 8.75737179e-01 1.11929886 1.18279556 6.10525115e-01 1
-1.20468574 -3.34579935 2.24193625 6.33315947e-01 1.28030592 -1.21891025 1.70461806 -8.51822534e-01 -9.03027486e-01 -2.04684363 2.11511948 -3.18820317 1
1.29156290 -2.80081078 -1.08142002 -1.56988022 -2.01918051e-01 -4.90260955e-02 -1.42761186e-01 -4.25550761e-01 1.28324374 2.28216365 -1.07576396 3.64120589 1
2.61127113 -1.44575773 -2.06164581 -2.37769312e-01 -5.85826243e-01 9.53046591e-02 -2.93503511 -5.72617954e-01 -2.12989816e-01 2.26835663 1.91384574 4.83561562 0
-5.57692689e-01 3.67759178e-01 -1.26834915 -7.55818610e-01 -1.66897254 4.82863532e-01 -8.51434204e-01 -1.42310382 -2.87231184 2.49399013 -1.03816959 2.70202786e-01 1
-1.43235566e-01 -2.00781797 1.62491628 -1.34472170 1.59773885 -1.38907674 2.05260387 -3.24611278e-01 2.82740361 -3.11273239 -6.96509195e-01 -1.89642157 1
-8.73274843e-01 -7.22858065e-01 2.73336142e-01 -2.78582242e-01 -5.13281633e-01 -5.27155495e-01 -4.69741866e-01 -3.03334800 1.15243197e-01 3.40730442e-01 1.03341115 -1.52159675 1
-6.04743376e-01 5.27854064e-01 1.23636534e-01 2.78227571 1.36819525 1.69942410 2.10243765e-01 -1.42782673 1.55767092 1.38179283 -1.26268699 2.54667203e-01 1
-1.15952374 4.37504002 -1.87857437 -1.17540651 -3.24860084 6.74903434e-01 -1.80945188e-01 1.67986695 -2.94755955 2.43310720 -2.73451121 -1.78479109 1
-2.65649245 -2.64964978 1.86332059 -1.15010068 8.76033039e-01 -6.33672690e-01 2.78339118 -1.07732331 -1.25526415 -6.92940643e-02 -1.50741809 -2.31476059 0
-1.10507172 1.66837115 4.08935625 1.09007506 -3.64870093 2.98688416 1.38443920 -3.29975999 -1.24107270e-01 1.85795516 1.41856448 -2.35004579 1
-2.43651471e-02 1.36885794 8.36208965e-01 -1.13095254 -1.76528719e-02 -1.54511658 2.14733218 -2.15958888 6.28326195e-01 1.61576338 -8.39437575e-01 -1.89047352 1
-2.60537112 1.03862798 7.56948173e-01 1.80372935 -1.01803246 2.20864022 -6.30274511e-01 -3.94980756e-01 -4.95903806 1.06594820 -4.07836945e-01 -3.57675144 1
1.07227288 -1.31582783 -1.96711656 4.84623207 1.94811009 6.38566330e-01 -6.62384072e-01 -8.16553811e-02 9.78944415e-01 6.75386575e-02 1.87933263 1.20793041 1
2.81213671 -7.33526131e-02 -1.55617135 1.70660527 -1.82330918 1.49472642 -2.86465990 -6.83497522e-01 -1.92916017e-01 2.33069690e-02 3.57352713 2.01587850 1
-1.14693706 5.12241805e-01 -1.21641363 -2.52390526e-01 6.40017671e-01 2.06424772 1.65884872 2.24988878 1.73338566 1.27692410e-01 -1.97429169 2.71957173 0
1.76659627 -1.00080249 9.62467375e-01 -1.71469409 4.16136545e-01 6.00663353e-02 2.61731589e-01 6.35545224e-01 4.30970187 -2.47413231 -5.41343845e-01 -1.21634320e-01 0
2.06152260 -1.98438081 -4.43614067e-01 2.92223129 8.62773565e-01 4.64850632e-02 -9.27688967e-01 -1.58076416 9.69689950e-01 1.78662144 2.44373579 3.94442455 1
2.29832652e-01 -2.28226848 -8.30809980e-01 -1.84365161 2.29660392e-01 -5.32121014e-01 4.05828536 8.77470433e-01 2.79406173e-01 1.13734684 6.62400789e-01 1.46427893 0
-1.69409303 6.19818894e-01 6.76135719e-01 -1.00039424 -1.01450678 1.94953202 3.88414276 -1.38209426 -1.03516436 1.50139576 -1.42656092 -1.61049847 1
1.37487310 -3.45669559 -1.20897388 3.00161755e-01 3.87736463 -3.36556998 -2.34989018 -4.97151350e-01 1.06915260 4.79405118e-01 2.32910372 6.03588957 0
-1.23902154 -7.75212650e-01 2.62258890 6.41512329e-01 1.62628168 -2.05174417 1.11431601 -1.80347528 1.33565856 -3.70455657e-01 -1.30500472e-01 -8.20318873e-01 0
3.05164717e-01 1.45431729 -9.01410849e-01 1.07253409 -2.25960775 1.48396673 -2.29014832 1.66180500 -3.99835121 8.10041924e-01 -2.67248952 -1.43182879 0
9.69103471e-01 -4.70043234 1.22067692 2.34565013e-01 2.63267182 -2.43468370 -2.37811794 -1.71653307e-02 9.93317620e-01 -3.65277062 2.46657669 -3.05195491e-01 0
1.93087718e-01 -1.65076896 -5.34367624e-01 -1.55868832 -2.53847391e-01 1.26683150 -1.09716744 -7.46140081e-01 -4.42227865e-01 1.78706030 3.65699942e-01 3.41268373 0
-2.43509667 1.96983168 1.47725203 -4.59597030e-01 3.08468298e-01 1.93580436 5.39626284e-01 7.45044944e-01 -9.79041607e-01 -1.34284668 -3.76823783 -4.14962701 1
5.50113185e-01 4.18857728e-01 1.54540311 -4.58174676e-01 -6.26742135e-02 1.06416396 -1.33842382 -1.38675209e-01 3.09305480 -3.59229225 2.04701392 5.18068731e-02 1
-2.39750332e-01 -3.09122018 -1.18792562 3.35025934 1.89650335 1.37174594 1.97730897 4.72461707e-01 -8.33671837e-01 -5.30964823e-01 1.19281560 -9.21920716e-01 1
-1.64544521 4.47727076 1.99869376 -9.13659795e-01 -1.12597061 -6.13370377e-01 2.59311269e-01 -1.80795254 -4.25330241e-01 2.60794182 -4.76795787e-01 4.60817449e-02 1
-1.93438405 -6.07003263e-01 2.95204164 -1.03432418 -2.45119428 -5.33340835e-01 -1.11916468 -8.48563766e-01 -4.28171473 5.11495481e-01 -8.03585399e-01 -3.84744859 0
8.77415195e-02 7.78771215e-02 1.95099232 -3.63690811 -1.34804927 -1.57050050 -1.08967421 -1.03230000 2.88106388e-01 4.61090480e-01 5.26708767e-01 1.98902936 1