How to save/load a PyTorch model with custom Storage pickling

I'm looking to store PyTorch model checkpoints in the usual format, except that I do not want the tensor Storage data to end up inside the pickle itself. I want to control where the tensor data is serialized and stored; the transfer will happen over RDMA, avoiding system memory, zipping, etc.

I would like to pass my own pickler to torch.save() (via the pickle_module argument), but that does not let me control the important part, the persistent_id method, since as far as I can tell torch.save installs its own persistent_id on whatever pickler it creates. How can I store and load tensor Storage data with my own logic while letting everything else flow through the normal checkpointing logic?
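To make the question concrete, here is a rough sketch of the behavior I am after, written against a plain pickle.Pickler/Unpickler pair rather than torch.save. The EXTERNAL_STORE dict, the class names, and the helper functions are just illustrative placeholders (the dict stands in for the RDMA-backed store), and the exact storage class surfaced during tensor pickling (TypedStorage vs. the legacy typed storages) varies by PyTorch version:

```python
import pickle

import torch

# Stand-in for the RDMA-backed store; in the real system the storage bytes
# would be written/read over RDMA instead of being kept in this dict.
EXTERNAL_STORE = {}


class StorageSkippingPickler(pickle.Pickler):
    """Pickle everything normally, but divert tensor storages out-of-band."""

    def persistent_id(self, obj):
        # Intercept storage objects before pickle serializes them inline.
        # torch.is_storage covers the registered storage classes; the getattr
        # guard keeps this from crashing on versions without TypedStorage.
        if torch.is_storage(obj) or isinstance(obj, getattr(torch.storage, "TypedStorage", ())):
            key = f"storage_{len(EXTERNAL_STORE)}"
            EXTERNAL_STORE[key] = obj          # real code: RDMA-write the raw bytes here
            return ("external_storage", key)
        return None                            # None -> pickle the object as usual


class StorageLoadingUnpickler(pickle.Unpickler):
    """Resolve the persistent ids written above back into storages."""

    def persistent_load(self, pid):
        tag, key = pid
        assert tag == "external_storage"
        return EXTERNAL_STORE[key]             # real code: RDMA-read and rebuild the storage


def save_checkpoint(obj, path):
    with open(path, "wb") as f:
        StorageSkippingPickler(f, protocol=pickle.HIGHEST_PROTOCOL).dump(obj)


def load_checkpoint(path):
    with open(path, "rb") as f:
        return StorageLoadingUnpickler(f).load()


if __name__ == "__main__":
    state = {"weight": torch.randn(4, 4), "step": 123}
    save_checkpoint(state, "/tmp/ckpt.pkl")
    print(load_checkpoint("/tmp/ckpt.pkl")["weight"].shape)
```

This round-trips on its own, but it bypasses torch.save/torch.load entirely, so I lose the standard checkpoint layout and compatibility with the normal loading path. What I would really like is a supported way to plug this persistent_id/persistent_load logic into torch.save and torch.load.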