[Resolved] Loss not decreasing

Hi, can anyone tell me why the loss is not decreasing? Basically it stays at the initial loss the whole time. Thanks in advance!
    optimizer.zero_grad()

    # forward
    output = net(imgdata)

    _, preds = torch.max(output.data, 1)

    for i in range(output.size()[1]):
        loss = criterion(output[:, :, :, i], labels.type(torch.LongTensor)[:, :, i])
    print('Loss per batch ', loss.data[0])

    if phase == 'train':
        loss.backward()
        optimizer.step()

    next = batch_indices

    running_loss += loss.data[0]

I’m not sure if this is going to solve your entire problem, but one error is that you iterate `i` over the size of the second dimension of `output` (`output.size()[1]`) and then use it to index the fourth dimension (`output[:, :, :, i]`). Unless those two dimensions happen to have the same size, the loop range doesn’t match the dimension you’re slicing.

You should either do:

    for i in range(output.size()[1]):
        loss = criterion(output[:, i], labels.type(torch.LongTensor)[i])

or

    for i in range(output.size()[3]):
        loss = criterion(output[:, :, :, i], labels.type(torch.LongTensor)[:, :, i])

You are right. Thanks for pointing that out!