```python
for epoch in range(EPOCH):
    for i, data in enumerate(train_loader, 0):
        net.train()
        model_out = net(
            Variable(data["left_aug"]), Variable(data["right_aug"])
        )
        REC_loss, disp_smooth_loss, lr_loss = get_loss(model_out)
        # Combine the individual loss terms before backprop
        loss = REC_loss + disp_smooth_loss + lr_loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```
What is the x axis? Is it epochs or iterations?
I would say yes, it's possible; it depends on a few things, though.

If the x axis is iterations and you calculate the mean of the loss from the start of the current epoch up to that iteration, then when the epoch ends and a new one begins, you start computing a fresh mean. You don't carry over the history from the previous epoch, and that's why you see this big drop.
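A minimal sketch of that effect, with made-up loss values: resetting the running mean at each epoch boundary "forgets" the high early losses, so the plotted curve drops abruptly at the start of every epoch even though the per-iteration loss is decreasing smoothly.

```python
# Hypothetical per-iteration losses over 2 epochs of 3 iterations each.
losses = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5]
iters_per_epoch = 3

running_means = []
total, count = 0.0, 0
for i, loss in enumerate(losses):
    if i % iters_per_epoch == 0:
        # New epoch: reset the running mean, dropping the earlier history.
        total, count = 0.0, 0
    total += loss
    count += 1
    running_means.append(total / count)

print(running_means)
```

Within epoch 1 the means are roughly 1.0, 0.95, 0.9; epoch 2 restarts at 0.7 instead of continuing from 0.9, which is the sudden drop seen in the plot.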

It could also be that your data loader has a few batches of "easy" samples at the end.

Try shuffling the data and post the resulting plot. Does the drop still reproduce?
Roy
The x axis is iterations, and I calculate the mean of the loss.
Now I understand the reason.
Thanks!