I have a model split across different GPUs and I would like to save it to disk and load it again. How could I do that?
I would recommend transferring the models to the CPU first and storing their
state_dict as described here.
This ensures you can recreate the models and load their
state_dicts on other systems even if they don't have a GPU.