I am working on a regression problem and facing an issue where my model can only be trained with a small amount of data; when I increase the amount of data, the network does not learn (the loss does not decrease).
Has anyone run into the same problem?
I don’t know how small the change in loss is, but we can typically observe this behavior with a small model.
It means your model is not big enough to fit the larger dataset.
Scale up your model and show how the loss changes.
The loss I used is L1 loss; it started from around 0.4 and could only decrease to about 0.3.
The loss value alone cannot tell you exactly how good your model is.
If you use nn.L1Loss, check whether you are averaging the loss over the batch size.
Anyway, scale up your model first.
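A minimal sketch of the reduction point above, using made-up tensors (not from the original poster's setup): the same predictions give very different loss values depending on whether nn.L1Loss averages or sums over the batch.

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.0, 1.0, 2.0, 3.0])
target = torch.tensor([1.0, 1.0, 1.0, 1.0])

# Default reduction='mean' averages over all elements,
# so the value is comparable across batch sizes.
mean_loss = nn.L1Loss(reduction='mean')(pred, target)  # (1+0+1+2)/4 = 1.0

# reduction='sum' grows with batch size; a "large" loss here
# may just reflect a large batch, not a bad model.
sum_loss = nn.L1Loss(reduction='sum')(pred, target)    # 1+0+1+2 = 4.0

print(mean_loss.item(), sum_loss.item())
```

So before comparing loss values across runs with different amounts of data, make sure the reduction mode is the same.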
Yeah, I mean it is supposed to decrease to around 0.01, but it could not. By the way, is scaling the model the same as adding more parameters?
Yes, by stacking more layers, widening the existing ones, or other approaches you already know.
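As an illustration of scaling up, here is a sketch with hypothetical layer sizes (the input dimension of 16 is invented, not from this thread): the second model stacks more layers and widens them, which is exactly "adding more parameters".

```python
import torch.nn as nn

# A small regression model that may underfit a larger dataset.
small = nn.Sequential(
    nn.Linear(16, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

# Scaled-up version: wider layers plus extra stacked layers.
large = nn.Sequential(
    nn.Linear(16, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 1),
)

def n_params(model):
    # Total number of trainable parameters (weights + biases).
    return sum(p.numel() for p in model.parameters())

print(n_params(small), n_params(large))
```

If the larger model still plateaus, the usual next suspects are the learning rate and input/target normalization, but increasing capacity is the first thing to rule out.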