The loss decreases a lot at the end of the first epoch. Is this fine, or is it a mistake?

    for epoch in range(EPOCH):
        for i, data in enumerate(train_loader, 0):
            net.train()

            # Variable is deprecated since PyTorch 0.4; pass tensors directly
            model_out = net(data["left_aug"], data["right_aug"])
            REC_loss, disp_smooth_loss, lr_loss = get_loss(model_out)

            # The snippet calls loss.backward() without ever defining loss;
            # combine the terms first (a plain sum here; add weights if needed)
            loss = REC_loss + disp_smooth_loss + lr_loss

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

What is the x-axis? Is it epochs or iterations?

I would say yes, it's possible. It depends on a few things, though:

  1. If the x-axis is iterations and you calculate the running mean of the loss from the start of the epoch up to that iteration, then when the epoch ends and a new one begins, you start computing a fresh mean. You no longer drag along the history from the previous epoch, which is why you see the big drop (see the sketch after this list).

  2. It could be that your data loader has a few batches of "easy" samples at the end.

  3. Try shuffling the data and post the resulting plot. Does the drop reproduce?
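
To make point 1 concrete, here is a minimal sketch of that logging pattern. It assumes a hypothetical train_step helper that runs one batch and returns its loss tensor (EPOCH and train_loader are the names from your snippet), and logging is a plain print. Because the running mean is reset at every epoch boundary, the first iterations of epoch 2 no longer carry the high losses from the start of training, so the plotted curve drops sharply:

    for epoch in range(EPOCH):
        running_loss, seen = 0.0, 0  # reset: previous epoch's history is dropped
        for i, data in enumerate(train_loader, 0):
            loss = train_step(data)  # hypothetical helper returning the batch loss
            running_loss += loss.item()
            seen += 1
            # Mean over the current epoch only; at the start of a new epoch
            # it reflects just the most recent (lower) losses.
            print(epoch, i, running_loss / seen)

For point 3, shuffling is just a DataLoader flag, e.g. torch.utils.data.DataLoader(dataset, batch_size=B, shuffle=True). If the drop disappears after shuffling, the cause was the sample ordering (point 2) rather than the logging.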

Roy


The x-axis is iterations, and I calculate the running mean of the loss, as in your point 1.

Now I understand the reason.

Thanks!