Hi, I have a question about DDP.
When I use DDP while dynamically freezing or unfreezing my model's parameters during training, one way to avoid errors is to set
find_unused_parameters=True. But on iterations where no parameters are actually unused during backpropagation, the following warning appears:
[W reducer.cpp:1050] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters, consider turning this flag off. Note that this warning may be a false positive if your model has flow control causing later iterations to have unused parameters. (function operator())
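For context, here is a minimal sketch of what I am doing. The model, sizes, and dummy data are illustrative rather than my real code, and it assumes a torchrun launch (so LOCAL_RANK is set) with one GPU per process:

```python
import os

import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

dist.init_process_group(backend="nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 1)).cuda()
# Needed so DDP tolerates the epochs where frozen parameters never receive
# gradients -- but it triggers the warning once nothing is frozen anymore.
ddp_model = DDP(model, device_ids=[local_rank], find_unused_parameters=True)

optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
# Dummy stand-in for my real dataloader.
dataloader = [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(10)]

for epoch in range(4):
    # Dynamically freeze/unfreeze: e.g. keep the first layer frozen early on.
    frozen = epoch < 2
    for p in model[0].parameters():
        p.requires_grad_(not frozen)

    for inputs, targets in dataloader:
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(inputs.cuda()), targets.cuda())
        loss.backward()  # the warning fires on epochs where nothing is frozen
        optimizer.step()

dist.destroy_process_group()
```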
Is there any way to disable find_unused_parameters mid-training, once all parameters are in use again?
P.S. Using pytorch_lightning is not an option for me right now.