Hi,
I am trying to implement an operator that will change the labels for computing the loss function. The loss function is like this:
import torch.nn as nn

class OhemCELoss(nn.Module):

    def __init__(self, thresh, ignore_lb=255):
        super(OhemCELoss, self).__init__()
        self.score_thresh = thresh
        self.ignore_lb = ignore_lb
        self.criteria = nn.CrossEntropyLoss(ignore_index=ignore_lb, reduction='mean')

    def forward(self, logits, labels):
        import ohem_cpp
        # minimum number of pixels to keep: 1/16 of the non-ignored pixels
        n_min = labels[labels != self.ignore_lb].numel() // 16
        labels = ohem_cpp.score_ohem_label(logits, labels,
                self.ignore_lb, self.score_thresh, n_min).detach()
        loss = self.criteria(logits, labels)
        return loss
Here ohem_cpp is implemented following the PyTorch C++ extension mechanism. ohem_cpp.score_ohem_label computes the softmax scores of the input logits, keeps the pixels whose score for the ground-truth class is below thresh (the hard examples, at least n_min of them), and sets all other elements of the labels to the assigned ignore_lb.
I noticed that in PyTorch 1.6, users should add the @custom_fwd decorator when implementing their own operators derived from autograd.Function. I have no idea whether I can call my ohem_cpp module directly, as in the code above, without potential errors. Did I miss anything here?
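If wrapping the op in an autograd.Function is what is required, my understanding of the decorator usage is roughly the following (a sketch only: the class name is made up and the forward is a no-op placeholder instead of the real ohem_cpp call):

```python
import torch
from torch.cuda.amp import custom_fwd, custom_bwd

class OhemLabelFn(torch.autograd.Function):
    @staticmethod
    @custom_fwd(cast_inputs=torch.float32)  # run the op in fp32 under autocast
    def forward(ctx, logits, labels):
        # placeholder for: ohem_cpp.score_ohem_label(logits, labels, ...)
        return labels.clone()

    @staticmethod
    @custom_bwd
    def backward(ctx, grad_out):
        # label selection is not differentiable w.r.t. either input
        return None, None
```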
I am asking because I sometimes (not often) hit the error terminate called after throwing an instance of 'thrust::system::system_error', and I do not know whether it is associated with incorrect use of my self-implemented operator.