Anaconda stopping kernel when I use my customized cost function

Hi. For my final project, I need to define a cost function that takes two inputs: one is a 3D tensor of size 30×7×7, and the other has size NumberOfObjectInImage×5. My project is a detection problem. NumberOfObjectInImage denotes the number of objects in one image, and the 5 corresponds to the bounding-box properties. Here is my code; so far I have only implemented the forward pass of the cost (it is based on the autograd extension tutorial):

import numpy as np
import torch
from torch.autograd import Function, Variable as V
# bb_intersection_over_union: IOU helper, defined separately (a sketch is given below)

class MyLoss(Function):
    def __init__(self, S, B, l_coord, l_nobj):
        super(MyLoss, self).__init__()
        self.S = S              # number of cells per side
        self.B = B              # number of bounding boxes per cell
        self.l_coord = l_coord  # weight of the coordinate terms
        self.l_nobj = l_nobj    # weight of the no-object confidence term

    def forward(self, pred_out, real_out):
        # pred_out: the 30 x 7 x 7 prediction tensor (indexed below with a leading batch dimension)
        # real_out: the NumObject x 5 ground-truth tensor (class, x, y, w, h)
        self.save_for_backward(pred_out, real_out)
        po = torch.LongTensor([2]).float()
        sum = torch.sum
        pow = torch.pow
        sqr = torch.sqrt
        print(type(pred_out))
        rt = real_out  # ground truth
        pt = pred_out  # prediction
        numObj = rt.size()[0]
        print(numObj)
        interval = np.linspace(0, 1, self.S + 1)
        cost = torch.FloatTensor([0])
        for index in range(numObj):
            cls = rt[index, 0]
            x = rt[index, 1]
            y = rt[index, 2]
            w = rt[index, 3]
            h = rt[index, 4]
            # Original ground truth, converted to corner format
            box1 = (x - (w / 2), y - (h / 2), x + (w / 2), y + (h / 2))
            # Select the cell responsible for this object
            colS = self.indices(interval, lambda q: q > x)[0] - 1
            rowS = self.indices(interval, lambda q: q > y)[0] - 1
            # Compute the IOU of every predicted box in that cell
            IOU = np.ndarray(shape=(1, self.B))
            for ind in range(self.B):
                px = pt[0, 0 + (5 * ind), rowS, colS]
                py = pt[0, 1 + (5 * ind), rowS, colS]
                pw = pt[0, 2 + (5 * ind), rowS, colS]
                ph = pt[0, 3 + (5 * ind), rowS, colS]
                box2 = (px - (pw / 2), py - (ph / 2), px + (pw / 2), py + (ph / 2))
                IOU[0, ind] = bb_intersection_over_union(box1, box2)
            # Select the best bounding box
            sel = IOU.argmax()
            x_hat = pt[0, 0 + (5 * sel), rowS, colS]
            y_hat = pt[0, 1 + (5 * sel), rowS, colS]
            w_hat = pt[0, 2 + (5 * sel), rowS, colS]
            h_hat = pt[0, 3 + (5 * sel), rowS, colS]
            c_hat_obj = pt[0, 4 + (5 * sel), rowS, colS]
            if sel == 0:
                c_hat_noobj = pt[0, 4 + (5), rowS, colS]
            else:
                c_hat_noobj = pt[0, 4 + (0), rowS, colS]
            p = torch.zeros(1, 20).view(-1)
            p[int(cls)] = 1
            p_hat = pt[0, 10:, rowS, colS]
            cost1 = self.l_coord * (pow(x - x_hat, po)) + self.l_coord * (pow(y - y_hat, po))
            print("cost1:", cost1)
            cost2 = pow(1 - c_hat_obj, po) + self.l_nobj * pow(0 - c_hat_noobj, po)
            print("cost2:", cost2)
            cost3 = self.l_coord * (pow(sqr(torch.FloatTensor([w])) - sqr(torch.FloatTensor([w_hat])), po)) \
                  + self.l_coord * (pow(sqr(torch.FloatTensor([h])) - sqr(torch.FloatTensor([h_hat])), po))
            cost += (cost1 + cost2 + cost3)
            del cost1, cost2, cost3, p
        return V(cost)

    def backward(self, grad_cost):
        pred_out, real_out = self.saved_tensors
        grad_pred_out = grad_real_out = None
        return grad_pred_out, grad_real_out

    def indices(self, a, func):
        return [i for (i, val) in enumerate(a) if func(val)]
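The `bb_intersection_over_union` helper is not shown above; a minimal sketch of such a helper, assuming corner-format boxes (x1, y1, x2, y2) like the ones built in forward, would be:

def bb_intersection_over_union(boxA, boxB):
    # coordinates of the intersection rectangle
    xA = max(boxA[0], boxB[0])
    yA = max(boxA[1], boxB[1])
    xB = min(boxA[2], boxB[2])
    yB = min(boxA[3], boxB[3])
    # intersection area (zero if the boxes do not overlap)
    inter = max(0.0, xB - xA) * max(0.0, yB - yA)
    # areas of the two boxes
    areaA = (boxA[2] - boxA[0]) * (boxA[3] - boxA[1])
    areaB = (boxB[2] - boxB[0]) * (boxB[3] - boxB[1])
    # IOU = intersection / union (small epsilon to avoid division by zero)
    return inter / (areaA + areaB - inter + 1e-8)

With the old-style Function API, the loss object is created and then called on the two inputs, roughly like this (S=7 and B=2 match the 30×7×7 output; the lambda values here are just placeholders):

criterion = MyLoss(S=7, B=2, l_coord=5.0, l_nobj=0.5)
loss = criterion(pred_out, real_out)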

I would like to know whether this error ("The kernel appears to have died. It will restart automatically.") is related to Anaconda or to my code. Could you please help me?
Thanks

Try running the code outside of an IPython notebook. It should print the full error then.

@apaszke The error was a segmentation fault! :slight_smile: :expressionless:

Try running this:

gdb python
<some output printed here>
> r your_script.py
<some more output. it will tell you that you got SIGSEGV and drop into shell again>
> where
<a few lines looking like `#0 THPFloatTensor... `. Paste it in a GitHub gist
and put a link in this thread>
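As an alternative to gdb, Python's built-in faulthandler module can also dump the Python-level traceback when a segfault happens; a minimal sketch (add these lines at the top of the script, or run it with `python -X faulthandler your_script.py`):

import faulthandler
faulthandler.enable()  # dumps the Python traceback on SIGSEGV and similar fatal signals

# ... rest of the script that triggers the crash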


@apaszke, I have finally found the source of the error: it was the return value of the forward function. It should be a tensor, not a Variable.
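In other words, the fix is just to return the accumulated tensor directly at the end of forward; roughly (a sketch, not the exact final code):

# at the end of MyLoss.forward
return cost        # return the FloatTensor itself
# return V(cost)   # wrapping it in a Variable here is what triggered the segfault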
