Can PyTorch support training on multiple GPUs that have different memory sizes?

A few years ago, I tried training my model on two RTX 3060s and one RTX 2060, and an error was raised.

Best regards,
SW.

Yes, PyTorch supports multi-GPU training across different GPU models, but you have to make sure you are not running out of memory on the device with the smallest capacity. The training will also most likely be bottlenecked by the slowest GPU, since the GPUs synchronize at each step.
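For reference, here is a minimal `DistributedDataParallel` sketch of what such a setup could look like. The model, batch size, and data are placeholders; the key point for mixed GPUs is that DDP uses the same per-rank batch, so it must fit on the GPU with the least memory (the RTX 2060 in your case):

```python
import os
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets LOCAL_RANK, RANK, WORLD_SIZE for each spawned process
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy model; size the per-GPU batch so it fits on the smallest GPU,
    # since every rank uses the same batch size.
    model = nn.Linear(128, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    for step in range(10):
        # Random stand-in data; replace with a DataLoader using a
        # DistributedSampler in real training
        x = torch.randn(32, 128, device=local_rank)
        y = torch.randint(0, 10, (32,), device=local_rank)

        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()   # gradients are all-reduced across all GPUs here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with e.g. `torchrun --nproc_per_node=3 train.py` for your three GPUs. The all-reduce in `backward()` is also why the slowest GPU sets the pace: the faster cards wait for it at every step.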

Great, thanks. Let me try.