Is CTC Loss doing its job?

My understanding of CTC is that it teaches the model to emit a blank token whenever it needs to wait before the next acceptable target label.
But I am not sure whether the CTCLoss function in PyTorch is actually doing that.
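For context, this "blank as a spacer" intuition comes from CTC's decoding rule: merge consecutive repeats, then drop blanks. A minimal sketch of that collapse rule (the function name and example path are mine, not from any library):

```python
def greedy_ctc_decode(path, blank=0):
    """Collapse a per-timestep argmax path: merge adjacent repeats, then drop blanks."""
    out = []
    prev = None
    for p in path:
        # Keep a label only when it differs from the previous frame and is not blank
        if p != prev and p != blank:
            out.append(p)
        prev = p
    return out

# Blank (0) both separates repeated labels and lets the model "wait":
print(greedy_ctc_decode([1, 1, 0, 1, 2, 0]))  # [1, 1, 2]
```

Because blanks are the only way to keep a repeated label from merging, a well-trained CTC model typically does emit blank at many timesteps.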
Below is an image of the posterior probabilities at each timestep for all the classes; the bottom-most row, ‘_’, is my blank class.

In my understanding, the probability of the blank class should be high more often than it currently is if the model is really learning to use the blank token and CTCLoss is helping it learn to wait. Does this look like CTC is working properly?
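One way to check this quantitatively rather than by eye is to measure how often blank is the argmax over the training run. A minimal sketch of how CTCLoss is typically wired up and how that fraction could be computed (shapes, the random inputs, and the target sequence are assumptions for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

T, N, C = 50, 1, 5                      # timesteps, batch size, classes (index 0 = blank)
logits = torch.randn(T, N, C, requires_grad=True)
log_probs = logits.log_softmax(dim=2)   # CTCLoss expects log-probabilities, shape (T, N, C)

targets = torch.tensor([[1, 2, 3, 2, 1]])             # hypothetical label sequence
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.tensor([5])

ctc = nn.CTCLoss(blank=0)               # blank index defaults to 0 in PyTorch
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()                         # gradients flow back into the logits

# Fraction of timesteps where blank dominates the posterior:
blank_frac = (log_probs.argmax(dim=2) == 0).float().mean().item()
```

Tracking `blank_frac` during training would show whether the model is converging toward the blank-heavy posteriors you expect; on an untrained model it should sit near chance level (1/C).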