Model's loss starts at a low value and converges too fast

Hi everyone,
I am new to PyTorch, and after working through some tutorials, I tried to reimplement the Flor HTR architecture from a GitHub repository (you can find it by searching online). The original uses TensorFlow, and I wanted to test the Flor architecture in PyTorch. However, when I train the model, my CTC loss always starts at around 10 and then drops to around 3 and 2. I have tried some methods such as:

Thank you,
If you have any questions, I am happy to answer them.

One thing to watch out for is that CTCLoss expects log probabilities as input, so you need to apply log_softmax to the input argument. Negative loss values may come from violating this constraint. (And from a cursory look at your code, it seems you use softmax instead.)
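To illustrate the point, here is a minimal, self-contained sketch (with made-up toy shapes, not the poster's actual model) showing `log_softmax` applied before `nn.CTCLoss`. With valid log-probabilities the loss is a negative log-likelihood and therefore non-negative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

T, N, C = 50, 4, 20  # input length, batch size, number of classes (0 = CTC blank)
logits = torch.randn(T, N, C)

# CTCLoss expects log-probabilities: apply log_softmax over the class dimension.
log_probs = F.log_softmax(logits, dim=-1)

targets = torch.randint(1, C, (N, 10))              # label indices, avoiding the blank
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), 10, dtype=torch.long)

ctc = nn.CTCLoss(blank=0)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
print(loss.item())  # non-negative, since -log P(target) >= 0 for valid log-probs
```

Feeding plain `softmax` outputs (probabilities in [0, 1]) into the same call violates the log-probability assumption, which is how negative loss values can appear.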

Best regards

Thomas

Thanks, Tom, for the fast reply. However, I am using LogSoftmax at the moment, so the negative values have not appeared again. You can see in my code that I assign self.softmax = nn.LogSoftmax(dim=-1) in the file I marked as model.py. So I think I am using CTCLoss the right way, but my loss value still starts at 10 and goes down to 3. I would be happy to hear if you find anything that may cause the problem.
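For reference, here is a hedged sketch of how a final layer like the one described (nn.LogSoftmax(dim=-1)) might feed CTCLoss. The class and dimension names are illustrative assumptions, not the poster's actual model.py; the main point is that CTCLoss wants the time dimension first, shape (T, N, C), which is another common source of odd loss curves:

```python
import torch
import torch.nn as nn

# Hypothetical recognizer head; names and sizes are illustrative only.
class Head(nn.Module):
    def __init__(self, hidden, num_classes):
        super().__init__()
        self.fc = nn.Linear(hidden, num_classes)
        self.softmax = nn.LogSoftmax(dim=-1)  # as in the poster's model.py

    def forward(self, x):                     # x: (N, T, hidden), batch first
        y = self.softmax(self.fc(x))          # (N, T, C) log-probabilities
        return y.permute(1, 0, 2)             # CTCLoss expects (T, N, C)

head = Head(hidden=64, num_classes=20)
out = head(torch.randn(4, 50, 64))
print(out.shape)  # torch.Size([50, 4, 20])
```

If the model keeps the batch dimension first when calling CTCLoss, the lengths and targets get misaligned even though the loss still computes a number.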

Thank you