Choose the correct optimizer and parameters

I’m new to neural networks and I am trying to train my first neural network with PyTorch.

I was working with MATLAB before and used its toolbox, but I want to explore neural networks further and improve the network, and MATLAB does not offer many options.

I just have a CSV with input data (2 inputs) and output data (4 outputs), so I am not training with images.

I made a simple network with 1 hidden layer.
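For reference, a network like the one described (2 inputs, 1 hidden layer, 4 outputs) can be sketched in PyTorch as below; the hidden width of 16 and the Tanh activation are assumptions for illustration, not details from the post.

```python
import torch
from torch import nn

# 2 inputs and 4 outputs as in the question; hidden width 16 is a guess.
model = nn.Sequential(
    nn.Linear(2, 16),   # input -> hidden layer
    nn.Tanh(),          # activation (choice is an assumption)
    nn.Linear(16, 4),   # hidden layer -> 4 outputs
)
loss_fn = nn.MSELoss()  # mean squared error, comparable to MATLAB's mse metric

# One forward pass on a dummy batch of 8 samples
x = torch.randn(8, 2)
y_hat = model(x)
```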

When I train the network in MATLAB I get an MSE of about 6e-5, but when I train with PyTorch using the Adam optimizer I get about 8e-3.

What I always ask myself: how do you choose the values (like batch size, betas, learning rate, epochs) for the optimizer, and is Adam the correct choice for this problem? Are the values just chosen by trial and error?

Neural network training is stochastic by nature: most optimizers are, you shuffle your training set, etc. The implementations also surely differ between MATLAB and PyTorch, so it is very unlikely you get exactly the same results, though you should expect a comparable order of magnitude. Choosing hyperparameters (batch_size, betas, lr, etc.) mainly relies on experience. That said, for some well-studied problem classes there are recommendations (e.g. Adam works well on NLP tasks, SGD works well on average, etc.). You can also control the learning rate dynamically with a scheduler (this works well for some problems) or optimize some of your hyperparameters with a package such as ray[tune]. Trial and error is a fine way to begin.
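As a concrete sketch of the two knobs mentioned above, here is Adam with its hyperparameters written out explicitly (these are PyTorch's defaults) plus a simple step-based learning-rate schedule; the model, step size, and decay factor are illustrative assumptions only.

```python
import torch

# Dummy one-layer model, just to have parameters to optimize.
model = torch.nn.Linear(2, 4)

optimizer = torch.optim.Adam(
    model.parameters(),
    lr=1e-3,             # step size; usually the first thing to tune
    betas=(0.9, 0.999),  # decay rates for the running gradient moments
)

# Dynamic learning-rate control: halve the lr every 50 epochs
# (step_size and gamma here are illustrative values).
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.5)

for epoch in range(100):
    # ... forward pass, loss.backward(), and so on would go here ...
    optimizer.step()   # placeholder step so the scheduler ordering is valid
    scheduler.step()   # after 100 epochs the lr has been halved twice
```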