CudaBCECriterion_updateOutput with Variable weight

TypeError: CudaBCECriterion_updateOutput received an invalid combination 
of arguments - got (int, torch.cuda.FloatTensor, torch.cuda.FloatTensor, 
torch.cuda.FloatTensor, bool, !Variable!), but expected (int state, 
torch.cuda.FloatTensor input, torch.cuda.FloatTensor target, 
torch.cuda.FloatTensor output, bool sizeAverage, 
[torch.cuda.FloatTensor weights or None])

Does this mean that we can't backprop through the weights, or am I doing something wrong?

Can you post your code? That error is saying the weights are being passed to the C/CUDA level as a Variable instead of a tensor.

Right. I want these weights to be a function of my data and current state.

weights = ae_class_dist[:, class_i]
# ones for the first block of examples, learned weights for the rest
weight_cat = torch.cat([Variable(torch.ones(features_a_i.size(0))).cuda(),
                        weights], 0)
# this runs with weight_cat.data (a tensor); passing weight_cat itself
# (a Variable) raises the error above
cross_ent = F.binary_cross_entropy(F.sigmoid(output.view(-1)),
                                   full_y.view(-1), weight=weight_cat.data)
total_dist += cross_ent

and then optimize total_dist. According to the docs, it seems possible:

    weight (Variable, optional): a manual rescaling weight
            if provided it's repeated to match input tensor shape

Thanks!


Try passing every parameter to binary_cross_entropy as a Variable; I don't know what the types of output or full_y are, but weight_cat.data is definitely a tensor.
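
For instance, with your snippet that would look roughly like this (your variable names, shapes assumed; I haven't run it against your data):

cross_ent = F.binary_cross_entropy(F.sigmoid(output.view(-1)),
                                   full_y.view(-1),
                                   weight=weight_cat)  # the Variable itself, not weight_cat.data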

Sure, sorry. It does run with .data and gives the error above without it. output and full_y are Variables. My question was basically: is it supposed to work, or is passing a Variable as the weights just not implemented? Thanks.

Yes, it works. Here's something close-ish to what you are trying:

F.binary_cross_entropy(Variable(torch.rand(3, 4), requires_grad=True),
                       Variable(torch.rand(3, 4), requires_grad=True),  # targets in [0, 1]
                       weight=Variable(torch.rand(4, 4), requires_grad=True)[:, 3])
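
A quick way to check that gradients actually reach the weights (my own sanity check, assuming your version routes the weight through autograd):

w = Variable(torch.rand(4), requires_grad=True)
loss = F.binary_cross_entropy(Variable(torch.rand(3, 4), requires_grad=True),
                              Variable(torch.rand(3, 4)),
                              weight=w)
loss.backward()       # loss is a scalar (size_average defaults to True)
print(w.grad)         # a non-None grad means the weights are part of the graph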

Can you come up with a minimal example that demonstrates your issue?