Hi, I have multiple losses.
It worked well on a single GPU.
However, when I tried it with DataParallel, it got stuck on loss.backward().
What should I consider when using multiple losses on multiple GPUs? The code below is a simplified flow of my implementation.
```python
import torch.nn as nn
from torch.nn import DataParallel

criterion = nn.CrossEntropyLoss().cuda()
model = DataParallel(model).cuda()
model.train()

for i, (inputs, labels) in enumerate(train_loader):
    labels = labels.cuda(non_blocking=True)
    logits = model(inputs)
    loss_1 = criterion(logits, labels)
    loss_2 = criterion(logits, labels)
    loss = loss_1 + loss_2
    loss.backward()
```
Thanks for your advice in advance!