'the derivative for 'weight' is not implemented' in BCELoss

I am calling binary_cross_entropy_with_logits(x, y, w).
x is a float tensor with requires_grad=True and is_cuda=True,
y is a long tensor with requires_grad=False and is_cuda=True,
and w is a float tensor with requires_grad=True and is_cuda=True.

PyTorch 1.0.1 raises the error "the derivative for 'weight' is not implemented", but PyTorch 0.3.1 does not. So I am confused: does PyTorch actually support the derivative with respect to weight in version 0.3.1?
And how can I solve this if requires_grad of w is True?
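
For reference, here is a minimal sketch of what I am running (the shapes, values, and the cast of y to float are placeholders, not my real data):

```python
import torch
import torch.nn.functional as F

# Minimal sketch of the failing call (placeholder shapes/values)
x = torch.randn(8, 1, device="cuda", requires_grad=True)   # logits, float, requires_grad=True
y = torch.randint(0, 2, (8, 1), device="cuda")             # targets, long, requires_grad=False
w = torch.rand(8, 1, device="cuda", requires_grad=True)    # weights, float, requires_grad=True

loss = F.binary_cross_entropy_with_logits(x, y.float(), weight=w)
loss.backward()  # RuntimeError: the derivative for 'weight' is not implemented
```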

Hello,

Here is an explanation that may help you.
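
If you do not actually need gradients with respect to w, passing w.detach() as the weight avoids the error. If you do need them, one possible workaround (a sketch, not the only option) is to compute the unreduced loss and apply the weight yourself, so autograd can differentiate through w:

```python
import torch
import torch.nn.functional as F

def weighted_bce_with_logits(x, y, w):
    # Same result as F.binary_cross_entropy_with_logits(x, y, weight=w) with the
    # default 'mean' reduction, but written out so autograd can also reach w.
    per_element = F.binary_cross_entropy_with_logits(x, y, reduction="none")
    return (w * per_element).mean()

# Toy example (placeholder shapes; works the same on CUDA tensors)
x = torch.randn(8, 1, requires_grad=True)
y = torch.randint(0, 2, (8, 1)).float()   # BCE targets must be float
w = torch.rand(8, 1, requires_grad=True)

loss = weighted_bce_with_logits(x, y, w)
loss.backward()
print(w.grad.shape)  # gradients for the weights are now available
```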
