Which is the best optimizer for non-linear regression?

I'm searching for a good optimizer for PyTorch. My NN is a numeric non-linear regression (not classification), with 3 neurons in the input layer, 6 in the hidden layer, and 8 in the output layer.
Which is the best optimizer for non-linear regression?

If you have that few parameters, you could try LBFGS; I use it e.g. for Gaussian Processes.
Note that it necessarily needs a closure that (re-)evaluates the model.

Here is a snippet that I use with GPs (which are a bit special in that they take the data and loss into the model; you'd have to feed your data into m and probably also go from predictions to a loss, called objective here), which should give you the idea:

# LBFGS needs a closure that (re-)evaluates the model and returns the objective
opt = torch.optim.LBFGS(m.parameters(), lr=1e-2, max_iter=40)

def eval_model():
    obj = m()           # the GP model computes its own objective from the stored data
    opt.zero_grad()     # clear gradients from the previous evaluation
    obj.backward()      # compute gradients of the objective
    return obj

for i in range(50):
    obj = eval_model()        # one evaluation to get a value for logging
    opt.step(eval_model)      # LBFGS calls the closure itself, possibly several times
    if i % 5 == 0:
        print(i, ':', obj.item())
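
For a plain regression network, where the data and the loss function live outside the model, the closure would do the forward pass, compute the loss, and backpropagate itself. A minimal sketch of that adaptation, assuming tensors x and y, a model model, and a loss function loss_fn that are not defined in the snippet above:

def eval_model():
    opt.zero_grad()          # clear gradients from the previous closure call
    pred = model(x)          # forward pass on the inputs only
    loss = loss_fn(pred, y)  # turn predictions into a scalar loss
    loss.backward()          # gradients for LBFGS to use
    return loss

for i in range(50):
    loss = opt.step(eval_model)   # LBFGS calls the closure itself, possibly several times
    if i % 5 == 0:
        print(i, ':', loss.item())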

Best regards

Thomas

Thanks. I'm using your code, but it gives me an error:

N, D_in, H, D_out = 1000, 3, 6, 8   # number of samples, inputs, hidden units, outputs

# Create random Tensors to hold inputs and outputs
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

def loss_fn(output, target):
    loss = torch.mean((output - target)**4)
    return loss

model= torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),)


opt = torch.optim.LBFGS(model.parameters(), lr=1e-2, max_iter=40)

def eval_model() :
    obj = model(input,output)
    opt.zero_grad()
    obj.backward()
    return obj

for i in range(50):
    obj = model(x, y)
    opt.zero_grad()
    obj.backward()
    opt.step(eval_model)
    if i % 5==0:
        print(i,':',obj.item())

error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-19-36374f9ee179> in <module>
     36 
     37 for i in range(50):
---> 38     obj = model(x,y )
     39     opt.zero_grad()
     40     obj.backward()

~\Anaconda3\lib\site-packages\torch\nn\modules\module.py in __call__(self, *input, **kwargs)
    539             result = self._slow_forward(*input, **kwargs)
    540         else:
--> 541             result = self.forward(*input, **kwargs)
    542         for hook in self._forward_hooks.values():
    543             hook_result = hook(self, input, result)

TypeError: forward() takes 2 positional arguments but 3 were given


I don't know why.

Very likely, you want to split that into pred = model(x); loss = loss_fn(pred, y) (loss being more common than obj here).
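
Putting that together, a corrected version of your script might look roughly like this (just a sketch, keeping the random data and the quartic loss from your snippet, and reading N as the number of samples so that the network really has 3 inputs, 6 hidden units, and 8 outputs):

import torch

N, D_in, H, D_out = 1000, 3, 6, 8   # samples, inputs, hidden units, outputs

# Random tensors standing in for the real inputs and targets
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

def loss_fn(output, target):
    return torch.mean((output - target) ** 4)

model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),
)

opt = torch.optim.LBFGS(model.parameters(), lr=1e-2, max_iter=40)

def eval_model():
    opt.zero_grad()          # clear gradients from the previous closure call
    pred = model(x)          # the model takes only the inputs
    loss = loss_fn(pred, y)  # compare predictions with the targets
    loss.backward()
    return loss

for i in range(50):
    loss = opt.step(eval_model)   # LBFGS re-evaluates the closure as needed
    if i % 5 == 0:
        print(i, ':', loss.item())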