How can I save a tensor that's too big for torch.save?

As a minimal example, the following code fails when you run it:

import torch
x = torch.zeros(2147483648)  # 2**31 float32 elements, about 8 GiB
torch.save(x, 'fails_to_save.pt')

The error that the above code produces:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\PC\AppData\Local\Programs\Python\Python37\lib\site-packages\torch\serialization.py", line 372, in save
    _save(obj, opened_zipfile, pickle_module, pickle_protocol)
  File "C:\Users\PC\AppData\Local\Programs\Python\Python37\lib\site-packages\torch\serialization.py", line 487, in _save
    zip_file.write_record(name, storage.data_ptr(), num_bytes)
TypeError: write_record(): incompatible function arguments. The following argument types are supported:
    1. (self: torch._C.PyTorchFileWriter, arg0: str, arg1: str, arg2: int) -> None
    2. (self: torch._C.PyTorchFileWriter, arg0: str, arg1: int, arg2: int) -> None

Invoked with: <torch._C.PyTorchFileWriter object at 0x000001BDD3B083B0>, 'data/1914873411776', 1915292139648, -8589934592
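
One thing that stands out to me: the repro tensor's byte size is exactly the num_bytes argument of the failing write_record call, just with the sign flipped, which makes me suspect some kind of signed-integer overflow in the writer (that is only my guess):

elements = 2147483648                 # 2**31 float32 values in the repro tensor
bytes_per_element = 4                 # float32
print(elements * bytes_per_element)   # 8589934592 -- compare with the -8589934592 above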

The tensor I originally hit this error with had shape (1281167, 1008, 2).

The issue is tracked here.
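
In case it helps frame the question, this is the kind of chunked workaround I have been considering. It is only a sketch: save_in_chunks and load_from_chunks are names I made up, and it assumes the failure is tied to the size of a single saved storage, so that pieces well below the limit serialize fine.

import torch

def save_in_chunks(tensor, path_prefix, chunk_elems=500_000_000):
    # Flatten the tensor and save it slice by slice; .clone() makes sure each
    # file contains only its own chunk rather than a view of the full storage.
    flat = tensor.reshape(-1)
    starts = range(0, flat.numel(), chunk_elems)
    for i, start in enumerate(starts):
        torch.save(flat[start:start + chunk_elems].clone(), f'{path_prefix}.{i}.pt')
    return len(starts)

def load_from_chunks(path_prefix, num_chunks, shape):
    # Load the pieces back in order and restore the original shape.
    parts = [torch.load(f'{path_prefix}.{i}.pt') for i in range(num_chunks)]
    return torch.cat(parts).reshape(shape)

# usage
x = torch.zeros(2147483648)
n = save_in_chunks(x, 'big_tensor')
y = load_from_chunks('big_tensor', n, x.shape)

Is something like this the only option, or is there a cleaner way to save a tensor of this size in one go?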