CTC Loss - for all samples

Hello,

I am learning about CTC loss from the CTCLoss — PyTorch 1.10.0 documentation.

If I have 100 samples, I get a single CTC loss value for all samples in the batch:

import torch

# `input`, `target`, `input_lengths`, `target_lengths` are assumed to be defined
ctc_loss = torch.nn.CTCLoss()  # default reduction='mean' averages over the batch
loss = ctc_loss(input, target, input_lengths, target_lengths)
loss.backward()
print(loss)  # tensor(7.8303, grad_fn=<MeanBackward0>)

But how can I get the CTC loss for each sample individually instead of one value for the whole batch?

Thanks

Use torch.nn.CTCLoss(reduction='none') to get the loss for each sample.
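Here is a minimal runnable sketch with random data; the shapes (T = 50 time steps, N = 100 batch samples, C = 20 classes including the blank, S = 30 max target length) are placeholder assumptions, not values from the question:

import torch

# Hypothetical shapes: T time steps, N batch size, C classes (incl. blank), S max target length
T, N, C, S = 50, 100, 20, 30

# CTCLoss expects log-probabilities of shape (T, N, C)
log_probs = torch.randn(T, N, C).log_softmax(2).requires_grad_()
targets = torch.randint(1, C, (N, S), dtype=torch.long)  # class 0 is reserved for the blank
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.randint(10, S + 1, (N,), dtype=torch.long)

ctc_loss = torch.nn.CTCLoss(reduction='none')
loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
print(loss.shape)  # torch.Size([100]) -> one loss value per sample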

@ptrblck, thanks for the solution.

Do you know how to add weights to the loss when calculating CTC loss?

After using reduction='none' you will get the unreduced per-sample losses, which you can multiply by your weights before reducing them to a scalar loss value (e.g. via mean or sum).
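For example, continuing the sketch above (reusing its `ctc_loss`, `log_probs`, `targets`, and length tensors), where `weights` is a hypothetical per-sample weight tensor of shape (N,):

# `weights` is a hypothetical tensor of per-sample weights, shape (N,)
weights = torch.rand(N)

per_sample_loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss = (per_sample_loss * weights).mean()  # weighted reduction to a scalar
loss.backward()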

Yes, I figured that out, thanks for helping @ptrblck 🙂