KFrank
(K. Frank)
February 27, 2021, 2:53pm
Hi Dr. Noob!
AbsolutePytorchNoob:
I’m training a simple XOR neural network
self.fc1 = nn.Linear(2,2)
self.fc2 = nn.Linear(2,1)
You have a “hidden layer” with two “neurons.” I believe that this
does not have enough structure to capture XOR (and the quadratic
term embedded in it). Try adding a third hidden neuron:
self.fc1 = nn.Linear(2, 3)
self.fc2 = nn.Linear(3, 1)
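For context, here is a minimal end-to-end sketch of a network with the suggested three hidden neurons. The original post's activation and loss aren't shown, so `Sigmoid` and `BCELoss` are assumptions chosen for illustration:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# the four XOR input/target pairs
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

# 2 -> 3 -> 1 network; activation choice is an assumption,
# not taken from the original post
model = nn.Sequential(
    nn.Linear(2, 3),
    nn.Sigmoid(),
    nn.Linear(3, 1),
    nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

initial_loss = loss_fn(model(X), y).item()
for _ in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
final_loss = loss.item()
```

With the extra hidden neuron (and enough training steps) the loss on the four XOR points should drop toward zero, whereas the 2-2-1 version often gets stuck.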
Please see this related post for some additional explanation:
I have a very simple example of 6 training samples with 6 labels. I am trying to fit a simple logistic regression to it, hoping it will over-fit to 0 loss, but the model does not converge.
Here is the rule I am hoping the model will learn:
When the input is [1., 0., 0., 1.], the label is 1
When the input is [0., 1., 1., 0.], the label is 1
When the input is [1., 0., 1., 0.], the label is 0
When the input is [0., 1., 0., 1.], the label is 0
Is such a pattern something a non-linear model cannot capt…
Best.
K. Frank