'int' object is not callable in my custom loss function

I'm trying to implement my own loss function but I'm running into a problem. I tried many different variable types, such as long/float/int, to get rid of "'int' object is not callable", but I can't figure out where the problem comes from.

labels.size() = torch.Size([4, 11, 224, 224]), <class 'torch.Tensor'>
output.size() = torch.Size([4, 11, 224, 224]), <class 'torch.Tensor'>

My Custom Loss Function

DSC = 2|X ∩ Y| / (|X| + |Y|)

class Dice_Loss(torch.nn.Module):
    def __init__(self):
    def forward(self,x,y):
        x = x.view(-1,dtype = torch.float)
        y = y.view(-1,dtype = torch.float)
        square_x = torch.sqrt(torch.square(x))
        square_y = torch.sqrt(torch.square(y))
        intersection = 2*torch.mul(square_x,square_y)
        #multipication = torch.sum(torch.sqrt(torch.square(x)),torch.sqrt(torch.square(y)))
        multipication = square_x.add(square_y)
        return (intersection/multipication)

Applied with:

dice_loss = Dice_Loss.apply or Dice_Loss()


TypeError                                 Traceback (most recent call last)
<ipython-input-47-04945a48ff0c> in <module>()
     14     print(output.size())
     15     print(labels.size())
---> 16     loss_dice = dice_loss(labels,output)
     17     loss_dice.backward()
     18     optim.step()

TypeError: 'int' object is not callable


It is not about the data types you are using; I think it's just a typo. Somewhere else in your code you are using dice_loss, but it no longer refers to the Dice_Loss class.
The error says that dice_loss, which should be an instance of Dice_Loss, is an int.

It is pretty simple to find errors like this: run your code in debug mode, add dice_loss to the watch list, and check its type after dice_loss = Dice_Loss.apply or Dice_Loss().
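A minimal, framework-free reproduction of this failure mode (the class here is a stand-in, not the real loss module; only the name binding matters):

```python
class DiceLoss:
    """Stand-in for the real loss module; only the name binding matters here."""
    def __call__(self, labels, output):
        return 0.0

dice_loss = DiceLoss()
dice_loss(None, None)   # fine: dice_loss is callable

dice_loss = 3           # accidental rebinding, e.g. reusing the name for a number
try:
    dice_loss(None, None)
except TypeError as e:
    print(e)            # 'int' object is not callable
```

Once the name is rebound to an int anywhere before the call site, every later call raises exactly the error in the traceback above.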


Hi nikronic, thanks for the comment. Actually I referred to it like this:

dice_loss = Dice_Loss()

and used it like this:

loss_dice = dice_loss(labels, output)

When I debugged it by eye again, I couldn't find any problem; either that or I am being quite careless.

Maybe I am missing something, but your loss class is correct as far as this error is concerned. Another point is that you have a syntax error in your class: view does not support dtype. So if you passed anything to dice_loss, you would get an error on that first line. Or, in an even simpler scenario, if you pass nothing to dice_loss, like:

loss = dice_loss()

you should get a missing-argument error, but you are not getting it, which means that somehow, somewhere, you are changing dice_loss.
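For reference on the view/dtype point: Tensor.view only takes shape arguments, so the cast has to be a separate call. A short sketch of how those two lines would need to be written (the tensor here is illustrative):

```python
import torch

x = torch.randint(0, 2, (4, 11, 224, 224))  # e.g. integer label masks

# x.view(-1, dtype=torch.float) is invalid: view() only takes shape arguments
x_flat = x.view(-1).float()                 # flatten first, then cast
# equivalently: x.reshape(-1).to(torch.float)
print(x_flat.dtype)                         # torch.float32
```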

You are right.

I changed it as follows (I marked the changed lines with "**"):

class Dice_Loss(torch.nn.Module):
    **def __init__(self,x,y):**
**        super(Dice_Loss,self).__init__()**
**        self.x = x**
**        self.y = y**
    def forward(self,x,y):
        self.x = x.view(-1)
        self.y = y.view(-1)
        square_x = torch.sqrt(torch.square(self.x))
        square_y = torch.sqrt(torch.square(self.y))
        intersection = 2*torch.mul(square_x,square_y)
        #multipication = torch.sum(torch.sqrt(torch.square(x)),torch.sqrt(torch.square(y)))
        multipication = square_x.add(square_y)
        return (intersection/multipication)


loss_dice = Dice_Loss(labels,output)

New traceback:

ModuleAttributeError                      Traceback (most recent call last)
<ipython-input-16-41ead3ef64fb> in <module>()
     15     print(labels.size(),type(labels))
     16     loss_dice = Dice_Loss(labels,output)
---> 17     loss_dice.backward()
     18     optim.step()
     19     running_loss += (loss.item()/len(images))

/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py in __getattr__(self, name)
    777                 return modules[name]
    778         raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
--> 779             type(self).__name__, name))
    781     def __setattr__(self, name: str, value: Union[Tensor, 'Module']) -> None:

ModuleAttributeError: 'Dice_Loss' object has no attribute 'backward'

I used Dice_Loss directly in the training process, but now the Dice_Loss object has no attribute backward. That is true, but should I implement a backward method in the new version of PyTorch? (I have seen a few comments without a backward method.) I can't find an example of this method. Do you have an example of a custom loss function's backward method?

You should not do that!
You only need to define the forward method and PyTorch will take care of backward for you.
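For completeness, here is a minimal working sketch along these lines. Only forward is defined; the elementwise result is reduced to a scalar (otherwise loss.backward() would need a gradient argument); and the smooth term is my addition (a common trick, not from the original code) to avoid 0/0 on empty masks. It assumes output already holds probabilities in [0, 1], e.g. after a sigmoid or softmax.

```python
import torch

class DiceLoss(torch.nn.Module):
    """Dice loss, 1 - 2|X n Y| / (|X| + |Y|), reduced to a scalar."""
    def __init__(self, smooth=1.0):
        super().__init__()
        self.smooth = smooth  # smoothing term (an addition here) avoids 0/0 on empty masks

    def forward(self, labels, output):
        labels = labels.reshape(-1).float()
        output = output.reshape(-1).float()
        intersection = (labels * output).sum()
        dice = (2 * intersection + self.smooth) / (labels.sum() + output.sum() + self.smooth)
        return 1 - dice  # scalar tensor, so loss.backward() is valid

dice_loss = DiceLoss()              # instantiate once...
# loss = dice_loss(labels, output)  # ...then call it inside the training loop
# loss.backward()
```

The key difference from the second version in this thread is that the module is instantiated once and then *called* with the tensors, rather than constructed with them, so the result of the call (not the module itself) carries backward.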

I am still insisting on the point that you have a small typo or something, because I can use your first code and everything is just fine!

I'm sorry, I misunderstood, nikronic. I'll check my code over and over again. Can you let me know if you see a mistake?

Frankly, I do not see any errors in the lines you have shared. I just ran your first post and it works. Do you want to create a notebook on Google Colab with a toy example and share the link so I can take a deeper look?
