pytorchcv (PyTorch Learner) June 17, 2023, 1:29am #1
idx = 17
model.load_state_dict(torch.load('/content/best_model.pt'))
image, mask = validset[idx]
print(image.shape)
logit_mask = model(image.to(DEVICE).unsqueeze(0))  # add a batch dimension
print(logit_mask.shape)
pred_mask = torch.sigmoid(logit_mask)
print(pred_mask.shape)
pred_mask = (pred_mask > 0.5) * 1.0  # threshold to a binary mask
ERROR --->>>
torch.Size([3, 512, 512])
AttributeError                            Traceback (most recent call last)
in <cell line: 14>()
     12 #logit_mask = torch.unsqueeze(image, dim=0)
     13 #logit_mask = image.to(DEVICE).unsqueeze(0)
---> 14 logit_mask = model(image.to(DEVICE).unsqueeze(0))
     15 print(logit_mask.shape)
     16

3 frames
/usr/local/lib/python3.10/dist-packages/segmentation_models_pytorch/losses/dice.py in forward(self, y_pred, y_true)
     57     def forward(self, y_pred: torch.Tensor, y_true: torch.Tensor) -> torch.Tensor:
     58
---> 59         assert y_true.size(0) == y_pred.size(0)
     60
     61         if self.from_logits:

AttributeError: 'NoneType' object has no attribute 'size'
Can someone help me out?
Check the forward method of your model, as it seems either y_true or y_pred is set to None (or both).
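For what it's worth, the exact AttributeError can be reproduced in isolation by handing DiceLoss a None target, which confirms the None is reaching the loss (a minimal sketch, assuming segmentation_models_pytorch is installed):

import torch
from segmentation_models_pytorch.losses import DiceLoss

logits = torch.randn(1, 1, 512, 512)   # stand-in prediction tensor
DiceLoss(mode='binary')(logits, None)  # passing None as y_true
# AttributeError: 'NoneType' object has no attribute 'size'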
pytorchcv (PyTorch Learner) June 18, 2023, 1:30am #3
def forward(self, images, masks=None):
    logits = self.backbone(images)
    if mask != None:  # then return logits and also calculate the loss
        return logits, DiceLoss(mode='binary')(logits, masks) + nn.BCEWithLogitsLoss()(logits, masks)
    return logits
This runs fine in the course I am learning from, but in my Google Colab it gives the error I mentioned above.
Thanks for your reply!
Add print statements to narrow down which line of code exactly raises the error. As can be seen in the stack trace, the DiceLoss expects two valid tensors as its input, while masks seems to use the default None value.
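As a concrete sketch of that advice (using the names from the forward() posted above), a print at the top of the method exposes the contradiction: masks is None, yet the loss branch still runs:

def forward(self, images, masks=None):
    logits = self.backbone(images)
    print('masks is None?', masks is None)  # during inference this prints True
    if mask != None:                         # yet this branch is still entered,
        ...                                  # which points at the condition itself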
pytorchcv (PyTorch Learner) June 18, 2023, 2:45am #5
if mask != None: was basically a typo. I had written mask, but it should be masks. That single typo caused the whole error. (Presumably it never raised a NameError because the notebook also defines a global mask via image, mask = validset[idx], so the condition compared that tensor against None and the loss branch ran with masks=None.)
Thanks for your response!
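For future readers, a corrected sketch of that forward (assuming the same self.backbone attribute and loss combination from the snippet above) would be:

import torch.nn as nn
from segmentation_models_pytorch.losses import DiceLoss

def forward(self, images, masks=None):
    logits = self.backbone(images)
    if masks is not None:  # training/validation: return logits and the combined loss
        loss = DiceLoss(mode='binary')(logits, masks) + nn.BCEWithLogitsLoss()(logits, masks)
        return logits, loss
    return logits          # inference: logits only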
Good catch as I didn’t see it!
pytorchcv (PyTorch Learner) June 18, 2023, 12:47pm #7
Thank you so much for your help!