Binary Segmentation Using DeepLabV3


I am trying to do a binary segmentation task on the COCO dataset using ResNet18 as the encoder and DeepLabV3 as the decoder. Looking around, I found that BCEWithLogitsLoss is the recommended criterion, but I cannot get my training to work. Can you please help me proceed in the right direction?

I am trying the following in the train method:
outputs = model(inputs)
loss = criterion(torch.nn.Sigmoid(outputs), labels)

The input to my network is torch.Size([32, 3, 256, 256])
The Output is torch.Size([32, 2, 256, 256])
and the labels are torch.Size([32, 1, 256, 256])

I cannot get the sigmoid to work. Earlier I tried CrossEntropyLoss with 2 classes, but I doubt it even worked. I am mostly unsure about the shapes.

I will be grateful for any help, thanks!

nn.BCEWithLogitsLoss expects the model output to contain logits, so you would have to remove the sigmoid (it’ll be applied internally in a more numerically stable way).
However, this criterion expects the model output and the target to have the same shape, which doesn’t seem to be the case in your workflow.
For a binary segmentation use case, the model should return an output in [batch_size, 1, height, width]. Based on the current shape, I assume you are setting the number of output classes to 2 for a multi-class segmentation. If that's the case, you would have to either change the last layer so that it outputs one class channel, or use nn.CrossEntropyLoss instead. In the latter case, you would have to remove the channel dimension from the target so that it has the shape [batch_size, height, width] and contains class indices in the range [0, nb_classes-1].
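As a quick sketch of both setups, using random tensors in place of the model output and target (the shapes match the ones posted above):

```python
import torch
import torch.nn as nn

batch_size, height, width = 32, 256, 256

# Option 1: binary segmentation with nn.BCEWithLogitsLoss.
# The model outputs a single channel of raw logits (no sigmoid),
# and the target is a float mask of the SAME shape with values in {0., 1.}.
logits_bin = torch.randn(batch_size, 1, height, width)  # stand-in for model(inputs)
target_bin = torch.randint(0, 2, (batch_size, 1, height, width)).float()
loss_bin = nn.BCEWithLogitsLoss()(logits_bin, target_bin)

# Option 2: two-class segmentation with nn.CrossEntropyLoss.
# The model outputs two channels of raw logits, and the target is a
# LongTensor of class indices WITHOUT the channel dimension.
logits_mc = torch.randn(batch_size, 2, height, width)
target_mc = torch.randint(0, 2, (batch_size, 1, height, width))
loss_mc = nn.CrossEntropyLoss()(logits_mc, target_mc.squeeze(1).long())
```

Both criteria apply their activation internally, so in neither case do you call sigmoid or softmax yourself.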

Many thanks for your reply. I am sure it's some detail I just don't understand.

Alternatively, I did try it the way you suggest, i.e. applying softmax to my output (shape [32, 2, 256, 256]), passing it to CrossEntropyLoss, and calling squeeze(1) on my target (shape [32, 256, 256] afterwards).
It runs, but it doesn't learn anything. The first output of the segmentation mask before training already looks good, since I use a pretrained transfer-learning model. However, after the first iteration and loss calculation, the output contains nothing; on top of that, the training loss goes down for a few batches and then plateaus.

I am quite certain my loss and shapes are just not right; it'll be awesome if you could help out. I wasted a bright sunny day getting nowhere.

nn.CrossEntropyLoss expects raw logits, so you would have to remove the softmax and rerun the script.
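In other words, the training step should look roughly like this (random tensors stand in for `model(inputs)` and the target here):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Stand-ins for the real tensors; in the actual script,
# `outputs = model(inputs)` already returns raw logits.
outputs = torch.randn(4, 2, 256, 256, requires_grad=True)  # [N, 2, H, W] logits
labels = torch.randint(0, 2, (4, 1, 256, 256))             # [N, 1, H, W] mask

# Pass the logits directly -- no softmax -- and squeeze the channel dim
# so the target has shape [N, H, W] with class indices.
loss = criterion(outputs, labels.squeeze(1).long())
loss.backward()
```

Applying softmax before nn.CrossEntropyLoss effectively applies log-softmax twice, which squashes the gradients and would explain the plateauing loss.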