What is the proper way to compute the loss iteratively?

Hello,

I am trying to compute the loss for my network in the following fashion.

loss = 0
for ...:
    loss += ...

Is this a proper way to do it? Does it affect backpropagation?
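
More concretely, here is a minimal runnable sketch of the pattern I mean (the model, criterion, and data below are just placeholders for illustration, not my actual code):

import torch
import torch.nn as nn

# Placeholder model, loss function, and data -- assumptions for the example.
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
batches = [(torch.randn(4, 10), torch.randn(4, 1)) for _ in range(3)]

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

optimizer.zero_grad()
loss = 0
for inputs, targets in batches:
    outputs = model(inputs)
    # Accumulate the per-batch losses into one tensor;
    # autograd records every term of the sum.
    loss += criterion(outputs, targets)

loss.backward()   # gradients flow back through each accumulated term
optimizer.step()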

Hi,
I don’t really understand what you are trying to do. Could you explain it in more depth?

Do you aim to compute the loss for a validation set, or to do it while training?

Regards