Well, with a square loss, it's essentially a plain linear regression:
If Y is your target, you want to minimize Loss(Net(X), Y) = (Net(X) - Y)^2,
with Net(X) = X.W + b.
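As a sketch of that setup (the shapes and toy data below are made up for illustration), training a single nn.Linear layer under MSELoss is exactly this square-loss linear regression:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: 100 samples, 4 features, 1 continuous target (invented shapes).
X = torch.randn(100, 4)
true_W = torch.randn(4, 1)
Y = X @ true_W + 0.1 * torch.randn(100, 1)

net = nn.Linear(4, 1, bias=True)   # Net(X) = X.W + b
loss_fn = nn.MSELoss()             # mean of (Net(X) - Y)^2
opt = torch.optim.SGD(net.parameters(), lr=0.1)

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(net(X), Y)
    loss.backward()
    opt.step()

print(loss.item())  # residual error shrinks toward the noise floor
```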
Then, if you add a nonlinearity to your model, in particular a sigmoid:
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # one linear layer from a flattened 3x65536 input to n_classes outputs
        self.fc1 = nn.Linear(3 * 65536, n_classes, bias=True)

    def forward(self, x):
        x = self.fc1(x.view(x.size(0), 3 * 65536))
        # F.sigmoid is deprecated; torch.sigmoid squashes each output into (0, 1)
        return torch.sigmoid(x)
In that case you are much closer to a logistic regression: the sigmoid squashes each output into (0, 1), which you can read as the probability that the sample belongs to the positive class. You then minimize the mismatch between these outputs and the categorical 0/1 targets. This is a typical logistic regression setup.