Multi-output regression with low dimension input using ANN

Hi all, I am new to artificial neural networks. Currently I am trying to solve a regression problem with 3 input variables, but the output dimension is around 40. I cannot get any acceptable fitting results after several trials.

  1. Are there any general rules for dealing with this kind of regression problem, such as how to choose the activation function, the number of layers, and the number of neurons in each layer? I used 3 layers with about 500 neurons in each layer, and ReLU activation in all layers. The fit seems to improve very little when I increase the model complexity.
  2. Apart from neural networks, are there any other tools suitable for this problem?


Wow, 3 inputs -> 40 outputs!

As a sanity check, have you tried training 40 independent linear regression models (using sklearn, maybe)?
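For reference, sklearn's `LinearRegression` accepts a 2-D target and fits one independent least-squares model per output column, so the 40-regressor baseline is a single call. A minimal sketch with synthetic stand-in data (the shapes are from the thread; the actual data is of course the poster's):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in data: 3 inputs, 40 outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
W = rng.normal(size=(3, 40))
y = X @ W + 0.1 * rng.normal(size=(1000, 40))

# With a (n_samples, 40) target, LinearRegression fits 40 independent
# least-squares regressors in one go.
model = LinearRegression().fit(X, y)
print(model.score(X, y))  # R^2 averaged over the 40 outputs
```

If this baseline already fits poorly, the problem is in the data or the target definition rather than in the network architecture.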

Thanks for your kind reply.
Using several independent models could be a solution, but there are some inner relations among the elements of the output vector, and I would like to modify the error function to account for them.
Currently I simply use the squared error. I am wondering if there is some fundamental limit when mapping a small number of inputs to a large number of outputs in neural-network regression?
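For illustration, one way to encode a relation among the outputs in the loss is to add a coupling penalty to the plain MSE. The sketch below assumes (hypothetically; the thread does not say what the relation is) that the 40 outputs are ordered samples of a curve, so neighbouring elements should vary smoothly:

```python
import torch

def coupled_mse(pred, target, lam=0.1):
    """Squared error plus a penalty tying neighbouring outputs together.

    Assumes the 40 outputs are ordered (e.g. samples of a curve) -- adapt
    the penalty term to whatever relation the outputs actually have.
    """
    mse = torch.mean((pred - target) ** 2)
    # First-difference smoothness penalty along the output dimension.
    smooth = torch.mean((pred[:, 1:] - pred[:, :-1]) ** 2)
    return mse + lam * smooth

# Autograd differentiates through both terms, so this drops straight
# into a standard training loop in place of nn.MSELoss.
pred = torch.randn(8, 40, requires_grad=True)
target = torch.randn(8, 40)
loss = coupled_mse(pred, target)
loss.backward()
```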

That’s a lot, actually. Get nn.Linear(3, 40) working first (which is the same as 40 independent linear regressors) and incrementally add hidden layers with a small number of units. I don’t expect MLPs to ‘just work’ here.
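Concretely, the suggestion above might look like this (stand-in data with the shapes from the thread; swap in the commented `Sequential` only once the linear baseline trains):

```python
import torch
import torch.nn as nn

# Stand-in data: 3 inputs, 40 outputs.
torch.manual_seed(0)
X = torch.randn(1000, 3)
W = torch.randn(3, 40)
y = X @ W + 0.1 * torch.randn(1000, 40)

# Plain linear baseline, equivalent to 40 independent linear regressors.
model = nn.Linear(3, 40)
# Next step: one small hidden layer, added only after the baseline works.
# model = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 40))

opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print(loss.item())
```

Growing the model one small layer at a time makes it obvious which change breaks the fit, which a 3x500 network started from scratch does not.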

Thank you chsasank. Do you mean that MLPs cannot handle regression well when the output dimension is much larger than the input dimension? Why? Any other suggestions about my problem?
I am looking forward to your kind reply.

Deep MLPs are not so easy to train, because gradient flow through many fully-connected layers is not great.