How to change DDP parameter 'find_unused_parameters'=True to False during training

Hi, I have a question about DDP.

When I use DDP while dynamically freezing or unfreezing my model’s parameters during training, one option to avoid errors is to set find_unused_parameters=True. But when there are no parameters that are actually unused during backpropagation, a warning occurs, as below.

[W reducer.cpp:1050] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters, consider turning this flag off. Note that this warning may be a false positive your model has flow control causing later iterations to have unused parameters. (function operator())

Is there any way to disable find_unused_parameters partway through training?
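
For reference, my setup is roughly like this (MyModel, backbone, and rank are placeholders for my actual code, and I assume the process group is already initialized):

```python
from torch.nn.parallel import DistributedDataParallel as DDP

# MyModel and `backbone` are placeholder names; `rank` is this process's GPU index
model = MyModel().to(rank)

# phase 1: freeze part of the model, so its parameters receive no gradients
for p in model.backbone.parameters():
    p.requires_grad = False

# find_unused_parameters=True is needed while some parameters
# do not take part in the backward pass
ddp_model = DDP(model, device_ids=[rank], find_unused_parameters=True)
```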

P.S) Using pytorch_lightning is not an option for me right now :frowning:


I’m not sure if you can completely disable it, but based on the usage of TORCH_WARN_ONCE as seen here, I would expect to see this warning only once. Or are you seeing it in each iteration?

Only once when I unfreeze the rest of the parameters.

By the way, does that also make performance (i.e., accuracy) worse?
I thought only a speed issue would occur, but the metric value drops significantly and then comes back up repeatedly. (Or maybe my model is just not stable :frowning:)

No, the accuracy should not change, only the speed, as additional checks are needed.

Thanks for the reply. Right now I use add_param_group on the optimizer to unfreeze parameters.
I hope that find_unused_parameters can be updated dynamically in PyTorch later.
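
Roughly what I do when unfreezing (backbone is again a placeholder for the frozen submodule, and optimizer is the optimizer already in use):

```python
# unfreeze the remaining parameters ...
for p in ddp_model.module.backbone.parameters():
    p.requires_grad = True

# ... and hand them to the existing optimizer as a new parameter group
optimizer.add_param_group(
    {"params": ddp_model.module.backbone.parameters()}
)
```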


This is very annoying; it should be possible to disable this.
What is the workaround to use DDP and only update a subset of parameters?

@dlsf Thanks for the feedback! Is your use case that you would like to train for a few iterations/epochs with find_unused_parameters=True, then switch to find_unused_parameters=False for later iterations? If so, one option is to just create a new DDP instance with a different find_unused_parameters value for the remaining iterations.
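
Something along these lines (an untested sketch; ddp_model, model, and rank refer to the objects from your own training script):

```python
from torch.nn.parallel import DistributedDataParallel as DDP

# after the phase with unused parameters is over, unwrap the plain
# nn.Module and build a fresh DDP instance without the flag
model = ddp_model.module
ddp_model = DDP(model, device_ids=[rank], find_unused_parameters=False)

# the optimizer keeps working, since it still holds references to the
# same underlying parameter tensors
```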