Hi, I am a beginner with PyTorch.
My problem is how to add up all of the losses from iterating over the whole dataset, and then do the backward pass.
The code below should explain my problem more clearly.
import torch.nn as nn

# my network
class MyNet(nn.Module):
    ...
    def forward(self, input):
        ...
        return a, b

net = MyNet()
from torch.utils.data import DataLoader

# my dataloader for my own dataset
train_loader = DataLoader(dataset=train_data, shuffle=True, batch_size=1)
# my training
# the optimizer and the criterion have already been defined
for epoch in range(num_epochs):
    running_loss = 0.0
    optimizer.zero_grad()
    for i, data in enumerate(train_loader, 0):
        inputs, labels = data
        outputs_a, outputs_b = net(inputs)
        loss_a = criterion(outputs_a, labels)
        loss_b = criterion(outputs_b, labels)
        running_loss = running_loss + loss_a.item() + loss_b.item()
    running_loss.backward()
    optimizer.step()
I know there must be something wrong with my running_loss. What I want to do is: for one epoch, add up the losses from every iteration over the dataset and then do the backward pass once.
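My current guess is the sketch below (I am not sure it is correct): I keep the losses as tensors instead of calling .item(), so the summed loss still has a computation graph to call backward() on. The name epoch_loss is just something I made up.

# my guess: accumulate the loss tensors over the whole epoch, then backward once
for epoch in range(num_epochs):
    optimizer.zero_grad()
    epoch_loss = 0.0
    for i, data in enumerate(train_loader, 0):
        inputs, labels = data
        outputs_a, outputs_b = net(inputs)
        # keep the loss tensors (no .item()) so the graph is preserved
        epoch_loss = epoch_loss + criterion(outputs_a, labels) + criterion(outputs_b, labels)
    epoch_loss.backward()   # single backward over the sum of all batch losses
    optimizer.step()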
Also, is batch_size = 1 correct or not? And how should running_loss be defined? In some posts I saw that loss.data should be added together instead?
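For comparison, the pattern I saw in other posts does the backward per batch and only uses running_loss as a plain number for printing (with loss.item(), or loss.data in older posts). I am not sure which of the two I should use here:

# per-batch update; running_loss is only for logging
for epoch in range(num_epochs):
    running_loss = 0.0
    for i, data in enumerate(train_loader, 0):
        inputs, labels = data
        optimizer.zero_grad()
        outputs_a, outputs_b = net(inputs)
        loss = criterion(outputs_a, labels) + criterion(outputs_b, labels)
        loss.backward()              # backward on the loss tensor for this batch
        optimizer.step()
        running_loss += loss.item()  # just a Python float, no graph attached
    print(epoch, running_loss / len(train_loader))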