Torch: assign independent tasks to different GPUs

I have a script in which I train two models independently, on the same dataset and with the same initialization.

In pseudocode it reads something like:

  1. initialize warm-up model
  2. train and store warm-up model
  3. load warm-up model and train using 1st strategy
  4. load warm-up model and train using 2nd strategy
  5. compare performances

Is there a way to assign, within the script, each training run to a different GPU so that steps 3 and 4 happen in parallel, without calling a separate .py file for each step?

I am not aware of any way to do this in PyTorch itself. However, your use case is easily handled by writing a bash script: call the same .py script with different arguments, for example:

# "train.py" stands in for the name of your script
python train.py --warmup --model_out "weights.ckpt" # warmup

# one GPU per strategy; "&" runs both in parallel, "wait" blocks until they finish
CUDA_VISIBLE_DEVICES=0 python train.py --strategy_1 --model_in "weights.ckpt" --model_out "strategy_1.ckpt" &
CUDA_VISIBLE_DEVICES=1 python train.py --strategy_2 --model_in "weights.ckpt" --model_out "strategy_2.ckpt" &
wait

python train.py --compare --models_to_compare "strategy_1.ckpt" "strategy_2.ckpt"

Then you just run the above script with bash.
Hope this helps!

That seems to be the only way to do it, but the problem is that I want to call the script many times with different hyperparameter values, so it gets way too complicated…
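For a sweep like that, one option is to keep everything in a single Python script and launch each training run in its own process, pinned to its own GPU. Below is a minimal sketch using the standard-library multiprocessing module; the names (`train`, `make_jobs`, the `lr` hyperparameter, `weights.ckpt`) are placeholders, and `torch.multiprocessing` is PyTorch's drop-in wrapper around the same API:

```python
import multiprocessing as mp

# torch.multiprocessing wraps this module, so the same pattern works with real
# training code (use the "spawn" start method, which CUDA requires in
# subprocesses: mp.set_start_method("spawn")).

def make_jobs(lr):
    """One (strategy, device, lr) tuple per training run, one GPU each."""
    return [(strategy, f"cuda:{gpu}", lr) for gpu, strategy in enumerate((1, 2))]

def train(strategy, device, lr):
    # Stand-in for the real routine: load "weights.ckpt", move the model to
    # `device`, train with the given strategy and learning rate, then save a
    # checkpoint. Here it only reports what it would do.
    print(f"strategy {strategy} on {device} with lr={lr}")

if __name__ == "__main__":
    for lr in (1e-3, 1e-4):          # the hyperparameter sweep is a plain loop
        procs = [mp.Process(target=train, args=job) for job in make_jobs(lr)]
        for p in procs:
            p.start()                # steps 3 and 4 run concurrently here
        for p in procs:
            p.join()                 # wait for both before comparing
```

Because the sweep is an ordinary Python loop, adding hyperparameter values means editing one line instead of multiplying bash invocations.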