PyTorch Forums
Is the loss function paralleled when using DataParallel?
smth
May 28, 2017, 5:27pm
Yes — you can wrap the loss function inside a DataParallel too, if you'd like. That way the loss computation is also split across the replicas, and only the (much smaller) per-replica loss values are gathered back, rather than the full model outputs.
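A minimal sketch of the idea: bundle the model and the criterion into one `nn.Module` (the wrapper class name `ModelWithLoss` here is illustrative, not a PyTorch API), then wrap that combined module in `nn.DataParallel`. Each replica then computes its own shard's loss, and you reduce the gathered losses before calling `backward()`.

```python
import torch
import torch.nn as nn

class ModelWithLoss(nn.Module):
    """Bundles model + criterion so both run inside DataParallel replicas."""
    def __init__(self, model, criterion):
        super().__init__()
        self.model = model
        self.criterion = criterion

    def forward(self, inputs, targets):
        outputs = self.model(inputs)
        # Loss is computed per replica on that replica's input shard.
        return self.criterion(outputs, targets)

model = nn.Linear(10, 4)
wrapped = ModelWithLoss(model, nn.CrossEntropyLoss())
if torch.cuda.device_count() > 1:
    # Replicates the whole (model + loss) module across GPUs.
    wrapped = nn.DataParallel(wrapped).cuda()

inputs = torch.randn(8, 10)
targets = torch.randint(0, 4, (8,))
loss = wrapped(inputs, targets)
# DataParallel gathers one loss value per replica; average before backward.
loss.mean().backward()
```

Note that with `nn.DataParallel` the gathered result is a tensor with one entry per replica (or a scalar on a single device), so reducing with `.mean()` covers both cases.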