Loss value decreases slowly

Hi everyone,

I have an issue with my UNet model. In the upsampling stage, I concatenate convolution layers with some layers that I created. For some reason my loss decreases very slowly, and after 40-50 epochs the image disappears and I get a plain image with some pixel pattern. Are there any suggestions on what the cause might be and how I can fix it?
I used a perceptual loss combined with an MSE loss, and this optimizer:
torch.optim.SGD(prior_weights, lr=0.00001, momentum=0.9, weight_decay=0.0001)
Epoch: 0 Loss: 0.0021450244821608067
Epoch: 1 Loss: 0.002144632861018181
Epoch: 2 Loss: 0.0021442414727061987
Epoch: 3 Loss: 0.002143850550055504
Epoch: 4 Loss: 0.0021434600930660963
Epoch: 5 Loss: 0.0021430705673992634
Epoch: 6 Loss: 0.0021426803432404995
Epoch: 7 Loss: 0.0021422908175736666
Epoch: 8 Loss: 0.002141902456060052
Epoch: 9 Loss: 0.002141514327377081
Epoch: 10 Loss: 0.00214112619869411
Epoch: 11 Loss: 0.0021407371386885643
Epoch: 12 Loss: 0.0021403473801910877

It’s a bit hard to speculate about the root cause of this issue.
I would generally recommend trying to overfit a small dataset (e.g. just 10 samples) by playing around with some hyperparameters, and making sure that your current model as well as your training setup are able to do so.
Once this works, you could scale up the use case again.
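The overfitting sanity check above can be sketched as follows. The toy linear model and random data are placeholders for your UNet and dataset; the learning rate and epoch count are assumptions chosen so the toy setup converges:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Sanity check: can the setup drive the loss to ~0 on just 10 samples?
torch.manual_seed(0)
x = torch.randn(10, 4)
y = x @ torch.randn(4, 1)  # an exactly learnable mapping
loader = DataLoader(TensorDataset(x, y), batch_size=10)

# Stand-ins for your UNet and combined loss.
model = nn.Linear(4, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

for epoch in range(500):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()

print(loss.item())  # should approach 0 if model and training setup work
```

If the loss does not approach zero on such a tiny subset, the problem is in the model or training code rather than in the data scale, and that is where to debug first.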