Custom loss error [grad can be implicitly created only for scalar outputs]

Hey guys,
I can't figure out why this code is giving an error. Can anyone help?
It doesn't give an error when I use nn.MSELoss().

import torch
import torch.nn as nn
from torch.autograd import Variable

for epoch in range(1):
    for i, (A, B, labelA, labelB) in enumerate(training_generator):
        A = Variable(A.unsqueeze(0).float())
        B = Variable(B.unsqueeze(0).float())
        label = Variable((labelA == labelB).float())
        A.requires_grad = True
        B.requires_grad = True
        label.requires_grad = True

        pha, phb = model_lstm((A, B))
        pha = pha.squeeze()
        phb = phb.squeeze()

        out_warper = model_warper(torch.cat((pha, phb), 1))

        # mean absolute difference per sample, then a learned warp
        Sab = torch.abs(pha - phb).mean(1).unsqueeze(1)
        Sab = torch.exp(-Sab * out_warper)

        loss = label * nn.Sigmoid()(Sab) + (1 - label) * nn.Sigmoid()(1 - Sab)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

Error:

RuntimeError: grad can be implicitly created only for scalar outputs

Without an argument, backward() only works on scalar outputs, and your loss is still an elementwise tensor. Try reducing it to a single number with .sum() or .mean() before calling backward(). (Print the loss too, so you can see its shape.)
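For example, a minimal sketch of that change against your loop above (only the loss lines change, everything else stays the same):

        # Reduce the elementwise loss (shape: batch x 1) to a scalar so that
        # backward() can create the implicit starting gradient.
        loss = label * nn.Sigmoid()(Sab) + (1 - label) * nn.Sigmoid()(1 - Sab)
        loss = loss.mean()  # or loss.sum()

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()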


Thanks, somehow it worked using loss = nn.BCELoss(), since Sab is already a single number.
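For anyone finding this later, a rough sketch of how that nn.BCELoss() version might look. This assumes Sab is squeezed to match label's shape, that its values stay in (0, 1] (which BCELoss requires), and that label does not require grad; by default the criterion reduces to a scalar mean, which is why backward() then works:

        criterion = nn.BCELoss()  # reduces to a scalar (mean over elements) by default

        Sab = torch.exp(-Sab * out_warper).squeeze()  # assumed to lie in (0, 1]
        loss = criterion(Sab, label)                  # scalar loss; target must not require grad

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()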