Trying to load a torch model via Dropbox

I’m trying to find a way to download a saved PyTorch model hosted on Dropbox and load it into PyTorch. I can’t save or write a file to the server that is running PyTorch. I tried to download the torch model with the code below and got an AttributeError. Any other ideas on how to load a model without writing to a file?

        import pickle
        import requests
        import torch

        dropbox_url_act = "https://www.dropbox.com/path_here?&dl=1"
        dropbox_url_cri = "https://www.dropbox.com/path_here?&dl=1"

        # Download the saved actor and critic and unpickle the response bytes
        req = requests.get(dropbox_url_act)
        actor = pickle.loads(req.content)

        req = requests.get(dropbox_url_cri)
        critic = pickle.loads(req.content)

        agent.actor_local.load_state_dict(torch.load(actor))
        agent.critic_local.load_state_dict(torch.load(critic))

AttributeError: 'int' object has no attribute 'seek'. You can only torch.load from a file that is seekable. Please pre-load the data into a buffer like io.BytesIO and try to load from it instead.

Also, when trying to use torch.as_tensor, I get the following error.

During the algorithm initialization, the following exception occurred:

    RuntimeError: Overflow when unpacking long
      at Initialize in main.py:line 26
      :: self.agent.actor_local.load_state_dict(torch.as_tensor(actor))

I found a solution using BytesIO with torch.load()

Hi, how exactly did you solve it? :slight_smile:

    import base64
    from io import BytesIO

    import torch

    # self.Download is a helper that downloads the Base64 string from Dropbox
    act_b64_str = self.Download("enter_url_here")
    cri_b64_str = self.Download("enter_url_here")

    # Encode the downloaded strings to bytes
    act_b = act_b64_str.encode("UTF-8")
    cri_b = cri_b64_str.encode("UTF-8")

    # Decode the Base64 bytes back into the raw serialized model data
    act_d = base64.b64decode(act_b)
    cri_d = base64.b64decode(cri_b)

    # Wrap the raw bytes in an in-memory buffer and load the state_dicts
    self.agent.actor_local.load_state_dict(torch.load(BytesIO(act_d), map_location=lambda storage, loc: storage))
    self.agent.critic_local.load_state_dict(torch.load(BytesIO(cri_d), map_location=lambda storage, loc: storage))
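
(The map_location=lambda storage, loc: storage argument keeps every tensor on the CPU even if the checkpoint was saved from a GPU, so the load also works on a machine without a CUDA device.)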

It’s also worth noting that the PyTorch model was converted to Base64 before being uploaded to Dropbox.
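
In case it is useful to anyone else, here is a rough sketch of what that conversion step could look like, assuming the checkpoint is a state_dict like the ones loaded above (the actual Dropbox upload is not shown, and agent.actor_local is just the example model from earlier in the thread):

    import base64
    from io import BytesIO

    import torch

    # Serialize the state_dict into an in-memory buffer instead of a file on disk
    buf = BytesIO()
    torch.save(agent.actor_local.state_dict(), buf)

    # Base64-encode the raw bytes so they can be stored and downloaded as text
    act_b64_str = base64.b64encode(buf.getvalue()).decode("UTF-8")

    # act_b64_str can then be uploaded to Dropbox and later passed through the
    # decode / torch.load steps shown in the solution above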