Loss backward for 3D data (fully convolutional network for segmentation)

Update:

I actually found that loss.sum().backward() works.

Alternatively, you can use:

h = loss.size()[1]   # loss has shape (batch_size, H, W) with reduce=False
w = loss.size()[2]
loss.backward(torch.Tensor(1, h, w))   # gradient argument matching the loss shape

By the way, is loss.sum().backward() better or worse than loss.backward(torch.Tensor(1, h, w))?
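
For reference, here is a small repro of the shapes involved (the sizes below are made up): with reduce=False the loss keeps the shape (batch_size, H, W), so whatever is passed to backward() has to match that shape.

import torch
import torch.nn as nn

N, C, H, W = 1, 21, 8, 8                          # made-up sizes
outputs = torch.randn(N, C, H, W, requires_grad=True)
labels = torch.randint(0, C, (N, H, W))

loss = nn.NLLLoss(reduce=False)(nn.LogSoftmax(dim=1)(outputs), labels)
print(loss.shape)        # torch.Size([1, 8, 8]) -> (batch_size, H, W)

loss.sum().backward()    # runs without the "scalar outputs" error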

Thanks

============================

Hi, I’m using LogSoftmax(dim=1) on a 4D input (batch_size, num_classes, height, width) together with NLLLoss, which produces a 3D loss tensor.

But the call to loss.backward() raises:

RuntimeError: grad can be implicitly created only for scalar outputs

Can anyone help?

import torch
import torch.nn as nn

logsoftmax = nn.LogSoftmax(dim=1).cuda()
criterion = torch.nn.NLLLoss(reduce=False).cuda()   # per-pixel loss, no reduction

outputs = logsoftmax(outputs)        # outputs: (batch_size, num_classes, H, W)
loss = criterion(outputs, labels)    # loss: (batch_size, H, W)
loss.backward()                      # <- RuntimeError: grad can be implicitly created only for scalar outputs

Thanks.

You could change the criterion to reduce=True or call loss.mean().backward().
In the code above, you are calling loss.backward() with a random tensor, since torch.Tensor(1, h, w) does not initialize its values.
Change it to torch.ones if you want to pass an explicit gradient argument to backward().
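
To make that concrete, here is a minimal sketch of the three working variants (the model is replaced by a random tensor, and the sizes are made up):

import torch
import torch.nn as nn

N, C, H, W = 2, 21, 8, 8
outputs = torch.randn(N, C, H, W, requires_grad=True)
labels = torch.randint(0, C, (N, H, W))
logsoftmax = nn.LogSoftmax(dim=1)

# 1. Let the criterion reduce the loss to a scalar (the default behavior):
loss = nn.NLLLoss()(logsoftmax(outputs), labels)
loss.backward()

# 2. Keep the per-pixel loss and reduce it yourself:
outputs.grad = None
loss = nn.NLLLoss(reduce=False)(logsoftmax(outputs), labels)
loss.mean().backward()

# 3. Keep the per-pixel loss and pass an explicit gradient tensor;
#    torch.ones_like (not the uninitialized torch.Tensor) makes this
#    equivalent to loss.sum().backward():
outputs.grad = None
loss = nn.NLLLoss(reduce=False)(logsoftmax(outputs), labels)
loss.backward(torch.ones_like(loss))

All three produce gradients in outputs.grad; they only differ in how the per-pixel losses are weighted (mean scales the gradients by 1 / (N * H * W) compared to sum).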

I see, I didn’t know there were so many ways to do it. Thanks!

Is any one of them better than the others, at least for FCN segmentation?

I would just use reduce=True and call loss.backward().
It’s the “default” way in my opinion.
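
For example, applied to the snippet from the question (a sketch, assuming the same outputs, labels and logsoftmax as above):

criterion = torch.nn.NLLLoss().cuda()            # default reduction returns a scalar
loss = criterion(logsoftmax(outputs), labels)    # 0-dim tensor
loss.backward()                                  # no gradient argument needed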