CTCLoss: get alpha and beta


Is there any way to easily retrieve the alpha and beta (or gamma) tensors after computing the CTCLoss?
The objective is to use them in another loss function that would force the alignment to be balanced across the sequence of output states (the number of input timesteps assigned to each output state should be nearly the same for all states).

Thank you!

Hi Christophe,

I don’t have an answer on retrieving the alpha and beta probability tensors computed inside the CTC loss, which in turn uses the well-known forward-backward algorithm from HMM-GMM parameter estimation. However, reading your final objective of forcing the same number of timesteps in each state: you don’t need an HMM, and hence a CTC-like loss, for that, since CTC is designed to do precisely the opposite, i.e., dynamic time warping. If you still want to force an equal number of timesteps per state, you can use a simpler model that uniformly segments your input time series across your target states; then you wouldn’t need the CTC loss or the alpha/beta probability tensors.
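To illustrate, a uniform segmentation like the one described above can be sketched in a few lines; this is just a hand-rolled example (the function name `uniform_segmentation` is made up here), splitting T input timesteps as evenly as possible over S target states:

```python
import numpy as np

def uniform_segmentation(num_timesteps, num_states):
    """Assign each input timestep to a target state so that per-state
    counts are as equal as possible (they differ by at most 1)."""
    # np.array_split produces num_states chunks whose sizes differ by at most 1
    chunks = np.array_split(np.arange(num_timesteps), num_states)
    # label every timestep in chunk s with state index s
    return np.concatenate([np.full(len(c), s) for s, c in enumerate(chunks)])

# e.g. 10 timesteps over 3 states -> [0 0 0 0 1 1 1 2 2 2]
print(uniform_segmentation(10, 3))
```

With such a fixed alignment you could train with a plain per-frame cross-entropy instead of CTC, since there is no warping left to marginalize over.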

Right, in fact I want to use the “uniform segmentation” criterion as a kind of regularizer for the CTC loss. I still want to train the model with flexible alignments via CTC, but I basically want to avoid degenerate cases where “one state takes all”. Of course, this might be achieved through careful initialization, but a regularizer would be more powerful than uniform initialization alone, whose influence fades with more training epochs.
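One way such a balance regularizer could look, without access to the internal alpha/beta tensors, is to use the per-frame softmax posteriors of the target classes as a rough proxy for state occupancy and penalize deviation from a uniform occupancy. This is only a sketch of the idea; the function `balance_regularizer` and the posterior-mass proxy are assumptions here, not the true CTC occupancies (which would require alpha*beta):

```python
import numpy as np

def balance_regularizer(posteriors, target_labels):
    """Hypothetical balance penalty: encourages each output state to
    receive roughly the same total posterior mass over the timesteps.

    posteriors: (T, C) array of per-timestep softmax outputs
    target_labels: length-S sequence of class indices (the output states)
    """
    # soft "occupancy" of each output state: total posterior mass of its class
    occupancy = posteriors[:, target_labels].sum(axis=0)   # shape (S,)
    occupancy = occupancy / occupancy.sum()                # normalize to a distribution
    uniform = np.full(len(target_labels), 1.0 / len(target_labels))
    # squared deviation from the uniform distribution; 0 when perfectly balanced
    return float(((occupancy - uniform) ** 2).sum())
```

The total training objective would then be something like `ctc_loss + lam * balance_regularizer(...)`, with `lam` trading off alignment flexibility against balance. A true occupancy-based version would need the gamma (alpha*beta) tensors the original question asks about.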
Thanks for your reply!