Hi all,
I was trying to use the PyTorch wrapper of the CTCLoss function (GitHub - SeanNaren/warp-ctc: Pytorch Bindings for warp-ctc).
But it doesn't work with the current master of PyTorch. I ran this sample code:
import torch
from torch.autograd import Variable
from warpctc_pytorch import CTCLoss
ctc_loss = CTCLoss()
probs = torch.FloatTensor([[[0.1, 0.6, 0.1, 0.1, 0.1], [0.1, 0.1, 0.6, 0.1, 0.1]]]).transpose(0, 1).contiguous()
labels = Variable(torch.IntTensor([1, 2]))
label_sizes = Variable(torch.IntTensor([2]))
probs_sizes = Variable(torch.IntTensor([2]))
probs = Variable(probs, requires_grad=True)
cost = ctc_loss(probs, labels, probs_sizes, label_sizes)
cost.backward()
And it breaks with this error:
Traceback (most recent call last):
File "ctc_test.py", line 16, in <module>
cost.backward()
File "/usr/local/lib/python3.5/dist-packages/torch/autograd/variable.py", line 128, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)
File "/usr/local/lib/python3.5/dist-packages/torch/autograd/__init__.py", line 83, in backward
variables, grad_variables, retain_graph, create_graph)
RuntimeError: expected Variable (got 'torch.FloatTensor')
Does anybody know how to fix this error?
Thanks in advance!