Convert .pt to JSON

Hi,
I am super new to all this. I used this project steerable-nafx/steerable-nafx.ipynb at main · csteinmetz1/steerable-nafx · GitHub to copy the effect of one of my guitar pedals, and as a result I got a .pt file.

Now I am trying to use this other repo to play my model in real time, but its input is a JSON file.

Any help pointing me in the right direction would be appreciated.

The second repository seems to use Keras, so I’m unsure how it would fit into the use case of deploying your PyTorch model. Could you describe the use case a bit more and what the JSON encoding would be used for?

What I am trying to achieve here is to emulate an analog guitar pedal through machine learning.
Here is what I have done so far:

  • Recorded myself playing guitar and split the signal into two channels (1 clean, 1 through the analog reverb pedal)
  • As a result I ended up with 2 wav files
  • I ran the 1st repo on Google Colab and uploaded both files (clean/input file, target/output file)
    ** When the code finished running, I got a .pt file:
    ** torch.save(model, "./reverb_full.pt")
    ** files.download("./reverb_full.pt")
  • Moving on to repo #2: I built this repo on my own computer with Anaconda.
    ** This repo basically takes a JSON file (model/network weights) and creates a VST3 plugin to later use in a DAW (Pro Tools, for example, or any other software to record music)
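The save/load step above can be sketched end to end. This is only a hedged illustration: loading the real reverb_full.pt would require the TCN model class from the steerable-nafx notebook to be importable, so a tiny stand-in Conv1d model (and the file name "reverb_full_demo.pt") are used here purely to show the mechanics.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the notebook's TCN model; the real .pt file
# can only be loaded where the original model class is defined.
model = nn.Sequential(nn.Conv1d(1, 8, kernel_size=13), nn.PReLU(), nn.Conv1d(8, 1, 1))
torch.save(model, "reverb_full_demo.pt")

# weights_only=False is needed on newer PyTorch versions, which default
# to weights-only loading and refuse to unpickle a full module otherwise.
loaded = torch.load("reverb_full_demo.pt", weights_only=False).eval()
state = loaded.state_dict()

# Each entry maps a parameter name to a tensor; the shapes tell you the
# layer layout the second repo's JSON will need to reproduce.
for name, tensor in state.items():
    print(name, tuple(tensor.shape))
```

Printing the shapes first is a useful sanity check before attempting any JSON conversion, since the target format has to mirror this layer structure.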

I asked the owner of the second repo how to convert .pt files to an acceptable JSON-formatted file, and this is what he said: How do I transform .pt files to json ? · Issue #46 · jatinchowdhury18/RTNeural · GitHub
Since I am fairly new with all this, maybe the response makes more sense to you.
Thank you so much for taking the time to even look into this!!

This is an example of the JSON file the second repo expects: RTNeural-example/neural_net_weights.json at main · jatinchowdhury18/RTNeural-example · GitHub

Thanks for the explanation! Your project sounds really interesting.

Based on your description, it seems that you are trying to use a C++ library which expects the model parameters in JSON format.
If so, the author is right, and I would also try to convert the model.state_dict() to this format by manually parsing it.
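To make "manually parsing it" concrete: the target is a JSON object describing layers, with the weights flattened into nested Python lists. The key names below ("in_shape", "layers", "type", "weights") are illustrative assumptions about the general shape of such files, not the authoritative schema; compare against the neural_net_weights.json example linked in this thread before relying on them.

```python
import json

# Illustrative sketch of an RTNeural-style weights file: a list of layer
# descriptions, each carrying its weights as plain nested lists.
# Key names are assumptions -- check the linked example for the real schema.
example = {
    "in_shape": [None, 1, 1],
    "layers": [
        {
            "type": "dense",
            "shape": [None, 1, 8],
            "weights": [[[0.1] * 8], [0.0] * 8],  # kernel first, then bias
        }
    ],
}

print(json.dumps(example, indent=2))
```

The point is that everything inside must be JSON-native types (lists, numbers, strings), which is exactly why raw tensors have to be converted first.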

@creativeguitar Agree with @ptrblck . This sounds interesting. Please keep the forum posted on your progress :grinning:

Made some progress:
I was able to convert the model to an OrderedDict.

I managed to extract the model weights from the .pt file. Is the information in this file in the correct format now?

model_verb = torch.load("reverb_full.pt", map_location="cuda").eval()
model_verb.state_dict()
              OrderedDict([('blocks.0.conv.weight',
              tensor([[[-0.1298,  0.0699, -0.4024, -0.4144, -0.6612, -0.3614, -0.2711,
                         0.1121,  0.3811,  0.4887,  0.6554, -0.4438,  0.3937]],
              
                      [[ 0.1641,  0.1940,  0.2088, -0.0812, -0.1690, -0.3806, -0.4686,
                        -0.4721, -0.5710, -0.3671, -0.3867,  0.6685, -0.0490]],
              
                      [[ 0.4627,  0.1778,  0.4389, -0.0471,  0.0029, -0.4056, -0.3544,
                        -0.2737, -0.2220,  0.4237,  0.3140, -0.0832,  0.1095]],
              
                      [[ 0.1386, -0.0815, -0.0198, -0.5040, -0.0545, -0.0723,  0.3019,
                         0.4373,  0.1611,  0.4337, -0.1162, -0.2702,  0.1192]],
              
                      [[ 0.0046,  0.4190,  0.3304,  0.3830,  0.2753,  0.0691, -0.0585,
                        -0.4008, -0.3159, -0.5296, -0.0253,  0.2469,  0.4646]],
              
                      [[ 0.3359,  0.6874,  0.4409,  0.1396,  0.1342,  0.0766,  0.4255,
                         0.2557,  0.6377,  0.1431,  0.2185, -0.0497, -0.2628]],
              
                      [[ 0.2687,  0.5740,  0.5573,  0.3525, -0.0186, -0.1923, -0.3670,
                        -0.2073, -0.3086, -0.1535, -0.4035,  0.5661, -0.4399]],
              
                      [[-0.0348,  0.2481,  0.3841,  0.5633,  0.3016,  0.6492,  0.0361,
                         0.3763, -0.2285,  0.0285, -0.0704, -0.3581,  0.4612]]])),
             ('blocks.0.conv.bias',
              tensor([-0.1317,  0.0801, -0.0657,  0.0111, -0.2019,  0.0651, -0.1624,  0.0881])),
             ('blocks.0.film.adaptor.weight', tensor([[-0.1962, -0.4742],
                      [ 0.0118,  0.2025],
                      [-0.0262, -0.5196],
                      [-0.7068,  0.0866],
                      [ 0.3649,  0.2775],
                      [ 0.1817,  0.3868],
                      [ 0.4642,  0.1849],
                      [ 0.5937, -0.4390],
                      [ 0.2238,  0.0662],
                      [ 0.1939,  0.0262],
                      [-0.4839, -0.3618],
                      [ 0.4020, -0.4487],
                      [ 0.1656,  0.2309],
                      [ 0.0682, -0.2875],
                      [ 0.0196, -0.0321],
                      [-0.6210,  0.5165]])),
             ('blocks.0.film.adaptor.bias',
              tensor([ 0.6033, -1.1071, -0.5714, -0.4874,  0.5016,  0.3080, -0.7018,  0.4283,
                      -0.1645, -0.7909, -0.0444,  0.4886, -0.1715,  0.5332,  0.4520,  0.7016])),
             ('blocks.0.act.weight', tensor([1.0010])),
             ('blocks.0.res.weight', tensor([[[ 0.5295]],
              
                      [[ 0.8668]],
              
                      [[-0.4472]],
              
                      [[-0.2738]],
              
                      [[ 0.0289]],
              
                      [[-0.0996]],
              
                      [[ 0.7559]],
              
                      [[ 0.2094]]])),
             ('blocks.1.conv.weight',
              tensor([[[-7.7118e-02,  8.3447e-02,  4.0766e-02,  1.0021e-01, -1.1097e-01,
                        -1.1143e-01,  1.7931e-01,  1.1139e-01,  1.7485e-01,  3.9384e-01,
                         3.1975e-01,  8.2028e-02, -3.1758e-02],
                       [ 5.2299e-02, -2.8190e-02, -6.4530e-02, -1.2040e-01, -6.6333e-02,
                        -5.0940e-02,  1.7104e-03,  1.6328e-01,  5.0179e-01,  3.3963e-02,
                         3.1775e-01, -2.9564e-02,  9.1750e-02],
                       [-9.0324e-02, -1.2507e-01, -2.4028e-03,  1.9625e-01,  1.4102e-02,
                         1.2354e-01, -7.0674e-03,  1.6727e-01, -4.3540e-02, -3.8468e-02,
                         5.8582e-02,  9.5664e-02, -2.1854e-01],
                       [ 4.7151e-02, -1.0220e-02, -2.5507e-02, -5.0436e-02,  7.4724e-02,
                        -1.5879e-02,  6.3913e-02, -1.9377e-01, -1.3045e-01, -8.6763e-02,
                        -2.4228e-01, -1.5414e-01, -5.2916e-03],
                       [ 1.2142e-01,  1.9297e-01,  7.4695e-02, -1.5313e-01, -1.0863e-01,
                        -1.1971e-02, -1.1517e-01, -3.2580e-01, -2.2143e-01, -8.9785e-02,
                        -1.3874e-01, -3.0188e-02, -1.7816e-01],
                       [-5.5444e-02, -1.6797e-01,  8.0191e-02, -9.3637e-02, -8.5481e-02,
                        -9.3189e-02, -1.5959e-01,  1.6686e-01,  1.9165e-02,  1.2173e-01,
                        -1.1160e-01, -8.7511e-02, -9.3956e-02],
                       [-7.4850e-02,  1.8889e-02, -1.8180e-01,  6.6027e-02,  1.1201e-01,
                        -3.8151e-02,  6.2812e-02, -1.0170e-01,  4.5755e-01, -1.6868e-01,
                         4.5296e-02, -3.9180e-02, -1.7269e-01],
                       [-5.1820e-02, -1.5111e-01, -4.1044e-02, -2.0386e-01, -7.6540e-02,
                        -1.7273e-01, -6.0671e-02,  6.4420e-03,  9.3887e-02,  2.3023e-02,
                         1.4818e-02, -4.1135e-02,  3.2534e-03]],
              
                      [[ 2.4877e-01,  3.7433e-01, -2.8565e-02, -7.1523e-02, -5.1796e-02,
                        -2.0912e-02,  1.1077e-01,  4.3306e-03,  1.3783e-01,  3.1706e-01,
                        -2.6163e-02, -3.7166e-01,  6.1846e-02],
                       [-1.5473e-01,  1.8668e-02,  1.3543e-02,  9.9810e-02, -1.6062e-01,
                        -2.7790e-01, -1.8959e-01, -4.4618e-03,  1.5717e-01, -1.3372e-01,
                         4.5878e-02, -3.1708e-01,  1.3398e-01],
                       [ 1.8705e-01, -3.2216e-02, -1.0383e-01, -1.7633e-01,  3.4827e-02,
                         1.1517e-01,  9.1034e-02, -3.0097e-02, -4.7922e-02,  1.6160e-01,
                         2.0551e-01,  2.4782e-01,  6.1695e-01],
                       [ 1.7939e-01, -1.3421e-01, -9.1555e-02, -6.3503e-02,  1.9247e-02,
                         1.0486e-01, -1.9626e-02,  2.2244e-01, -2.6803e-01, -8.8263e-02,
                         4.4043e-02, -1.6127e-01, -1.1129e-01],
                       [ 2.0268e-02, -3.6074e-01,  1.6279e-01,  2.2947e-01, -4.9506e-02,
                        -5.7764e-02, -8.1883e-02,  1.3576e-02, -2.1019e-01, -3.0769e-01,
                        -1.4614e-01,  1.3390e-01,  9.9876e-02],
                       [-3.3786e-01, -2.1277e-01, -5.0107e-02,  1.2277e-01,  3.5645e-02,
                        -1.9554e-01, -2.9955e-01,  3.2802e-02, -1.8622e-02, -1.5235e-01,
                         1.1373e-01, -4.2711e-01, -1.6299e-01],
                       [-8.6759e-02,  8.7027e-02,  9.5962e-03, -1.6813e-01, -1.0420e-02,
                         8.1293e-02, -6.2374e-02, -2.5756e-03, -8.2061e-03,  4.7230e-02,
                         1.6142e-01,  9.9275e-02, -2.0132e-01],
                       [-3.7854e-01,  7.2318e-02, -2.3543e-02,  9.0526e-02, -6.8289e-02,
                        -1.4311e-01, -1.6145e-02,  8.7110e-02,  6.3401e-02, -8.6678e-02,
                        -2.9887e-02, -1.9675e-01, -8.4255e-02]],
              
                      [[ 7.0295e-02,  1.2211e-01,  2.4917e-01,  6.7158e-02,  2.3083e-01,
                         2.5574e-01,  2.6019e-01,  1.0869e-02, -4.0891e-01, -1.7995e-01,
                        -1.9274e-01, -7.6871e-02,  2.4176e-01],
                       [ 3.4064e-01,  1.4763e-01,  2.1635e-01,  5.1499e-02, -5.7420e-02,
                         1.3587e-01,  4.1135e-02, -1.6955e-01,  2.1021e-02,  3.2060e-01,
                        -3.1013e-02, -9.5046e-02,  2.5616e-01],
                       [-4.2208e-02, -9.4603e-02, -2.3270e-01, -2.0234e-01,  6.6833e-03,
                        -8.6855e-02, -1.6360e-01, -1.4886e-01,  4.9193e-02,  2.2812e-02,
                        -2.0007e-01, -2.3832e-02,  3.2806e-01],
                       [-1.1941e-01, -6.7837e-02, -1.5781e-01, -1.3439e-01,  6.2534e-03,
                        -8.0717e-02, -7.8080e-02,  7.5875e-02,  5.0799e-02, -4.6651e-02,
                         1.8118e-01,  2.4067e-02, -1.4432e-02],
                       [ 7.1198e-02, -2.6421e-02,  1.3267e-01,  7.9205e-03, -3.2077e-02,
                         1.4993e-01,  2.0277e-01,  4.2164e-01,  3.3229e-01,  5.8825e-02,
                         4.0194e-01,  3.6094e-01, -2.0041e-02],
                       [ 1.2827e-01, -1.2717e-01,  3.4417e-02,  2.0028e-01,  9.4749e-02,
                        -8.7568e-02,  5.3344e-02,  9.0669e-02,  2.4654e-01, -5.8251e-02,
                         2.0593e-01,  1.5784e-01,  1.3699e-01],
                       [-1.2355e-01, -6.0096e-02, -5.4322e-02,  3.3811e-02,  1.0688e-02,
                         1.3268e-02, -4.7507e-02, -3.4242e-02, -4.9244e-01,  2.3481e-01,
                        -9.5430e-02, -1.2007e-01, -1.5129e-02],
                       [ 1.9327e-01,  7.2998e-02, -5.4335e-02,  1.0879e-01, -1.8569e-02,
                        -4.5218e-02,  8.2211e-02, -2.6313e-02,  8.0196e-02,  1.5123e-01,
                         2.4969e-01,  2.4261e-02, -9.3883e-02]],
              
                      [[ 2.7044e-01, -4.4969e-01, -1.6666e-01, -2.6497e-01,  5.7645e-01,
                         2.0759e-01,  3.3068e-01,  2.9487e-01,  9.7794e-02,  2.5556e-01,
                        -3.5541e-01, -3.4951e-01, -8.3556e-01],
                       [-2.6069e-01,  3.1115e-01,  3.5495e-01,  8.8355e-02,  4.8561e-02,
                         1.2516e-01,  2.6930e-01, -5.6581e-01, -1.7098e-01,  1.8108e-01,
                        -1.8756e-01,  6.6137e-02, -1.2789e-01],
                       [ 4.2592e-01,  1.8713e-01, -3.2536e-02,  1.8321e-01, -5.8510e-03,
                         1.4366e-01, -9.2859e-02,  1.1593e-01,  3.1847e-01,  4.9595e-01,
                         4.8627e-01,  5.9260e-01, -7.5621e-01],
                       [-2.1626e-01, -1.5242e-01, -4.2155e-01, -3.3769e-01,  8.0346e-02,
                        -3.1913e-01, -1.1539e-01,  8.5033e-02,  4.8322e-01,  1.2597e-01,
                        -1.9418e-01,  2.7334e-01, -8.4192e-02],
                       [-4.7208e-01,  1.4594e-01, -1.9303e-01, -3.8485e-01, -2.2596e-01,
                        -3.0036e-01, -9.1248e-02,  7.3560e-02,  1.7228e-01,  8.5677e-02,
                        -3.8423e-02,  2.1600e-01, -9.3320e-02],
                       [-3.4113e-01, -2.0200e-02,  1.5072e-01, -2.7774e-01, -4.4605e-01,
                         5.7131e-02, -4.0423e-01, -2.1992e-01, -7.4423e-02,  1.5540e-03,
                        -3.4256e-01,  1.3901e-01, -1.3820e-01],
                       [-1.7193e-01,  2.5362e-01, -2.7563e-01,  3.3381e-01, -9.4688e-02,
                         2.3650e-02,  1.0144e-01,  5.4558e-02, -1.3107e-01,  7.1892e-03,
                        -2.0494e-01,  3.3572e-02, -1.3990e-01],
                       [-4.7345e-01, -2.8328e-01, -1.8328e-02, -2.7507e-01, -2.8603e-01,
                        -2.4580e-02, -9.7987e-02, -4.0284e-01, -4.8121e-02,  1.4557e-03,
                         1.1749e-02, -1.1105e-04, -2.1119e-01]],
              
                      [[-1.7054e-01,  2.2629e-01,  3.3506e-02, -2.2939e-01, -5.5154e-01,
                        -2.0296e-01, -1.3050e-01,  4.7344e-02, -1.6214e-01, -6.0605e-01,
                         1.6213e-01,  1.9646e-01, -5.8959e-01],
                     ....
                       [ 0.3206],
                       [ 0.3733],
                       [ 0.2837],
                       [-0.0026],
                       [-0.2636],
                       [-0.0969],
                       [ 0.1241]],
              
                      [[ 0.1754],
                       [-0.1104],
                       [-0.2569],
                       [-0.4021],
                       [-0.2580],
                       [ 0.2359],
                       [ 0.0509],
                       [-0.0661]],
              
                      [[ 0.2241],
                       [-0.0666],
                       [-0.3219],
                       [-0.3639],
                       [ 0.0821],
                       [ 0.2402],
                       [ 0.1352],
                       [-0.0578]],
              
                      [[-0.1299],
                       [ 0.1841],
                       [-0.1881],
                       [-0.2539],
                       [-0.1086],
                       [ 0.0413],
                       [-0.2463],
                       [-0.2694]],
              
                      [[-0.2778],
                       [ 0.2714],
                       [ 0.1492],
                       [ 0.3711],
                       [ 0.2107],
                       [-0.0703],
                       [ 0.1655],
                       [ 0.0572]]])),
             ('blocks.4.conv.weight',
              tensor([[[ 0.0296,  0.0557, -0.1451,  0.0341,  0.1050, -0.1492, -0.0750,
                        -0.1599,  0.1910, -0.2220, -0.0977, -0.1056,  0.0507],
                       [-0.1125, -0.1840,  0.0892,  0.0035,  0.0436, -0.2446,  0.1554,
                        -0.0668, -0.0590, -0.0521,  0.0253,  0.1049,  0.0643],
                       [-0.2005,  0.0047,  0.0133, -0.0830,  0.0705,  0.2304, -0.5210,
                         0.1070, -0.0858, -0.2326, -0.1822,  0.2396,  0.2717],
                       [ 0.0448,  0.1796, -0.1071,  0.1064,  0.0658, -0.2038,  0.0239,
                         0.1440,  0.3935,  0.0065,  0.0390, -0.0515, -0.0550],
                       [ 0.0437, -0.0964, -0.0884,  0.2518, -0.1431, -0.1283, -0.2663,
                         0.1250, -0.1318,  0.0853,  0.2320,  0.0187,  0.0087],
                       [-0.2744, -0.0103,  0.1717, -0.1905,  0.0357, -0.0739, -0.1197,
                         0.1075,  0.1777,  0.2062, -0.2827,  0.3773,  0.0940],
                       [ 0.0672,  0.0666, -0.0710, -0.1071, -0.0116,  0.1229, -0.0914,
                        -0.1270, -0.0077,  0.3019, -0.2137, -0.2694, -0.1641],
                       [ 0.2700, -0.0170, -0.1053,  0.0857, -0.2341, -0.0246,  0.0311,
                        -0.3872,  0.0005,  0.2699, -0.1138,  0.2301,  0.2700]]])),
             ('blocks.4.conv.bias', tensor([-0.1231])),
             ('blocks.4.film.adaptor.weight', tensor([[ 0.0314, -0.0043],
                      [-0.6116, -0.5727]])),
             ('blocks.4.film.adaptor.bias', tensor([-0.5461, -0.2179])),
             ('blocks.4.act.weight', tensor([0.9973])),
             ('blocks.4.res.weight', tensor([[[-0.2765],
                       [ 0.2469],
                       [ 0.2226],
                       [ 0.3452],
                       [-0.0464],
                       [ 0.0720],
                       [-0.1452],
                       [ 0.1538]]]))])

I am now trying to go from OrderedDict to JSON. I followed this tutorial, How to convert Ordereddict to JSON? - GeeksforGeeks, but I am getting an error when trying to convert.

from collections import OrderedDict
import json

model_verb_cuda = torch.load("reverb_full.pt", map_location="cuda").eval()
od1 = model_verb_cuda.state_dict()
od1 = json.dumps(od1)
# Write the JSON string to a file
with open('json_data.json', 'w') as outfile:
    outfile.write(od1)
/usr/lib/python3.7/json/encoder.py in default(self, o)
    177 
    178         """
--> 179         raise TypeError(f'Object of type {o.__class__.__name__} '
    180                         f'is not JSON serializable')
    181 

TypeError: Object of type Tensor is not JSON serializable

Is it because I am dealing with a nested OrderedDict?

No, the error is raised because the tensor type isn’t directly serializable to JSON.
You could create lists from the values first and serialize them afterwards:

import json
from collections import OrderedDict

import torch.nn as nn

model = nn.Linear(10, 10)
od1 = model.state_dict()
od1 = OrderedDict({k: od1[k].detach().cpu().tolist() for k in od1})
od1 = json.dumps(od1)
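Extending that snippet into a full file round trip makes it easy to verify nothing is lost. This is the same idea with a write, a read-back, and a comparison; the file name json_data.json is simply the one used earlier in this thread.

```python
import json
from collections import OrderedDict

import torch
import torch.nn as nn

model = nn.Linear(10, 10)
sd = model.state_dict()

# Tensors -> nested Python lists, which json can serialize.
serializable = OrderedDict({k: v.detach().cpu().tolist() for k, v in sd.items()})

with open("json_data.json", "w") as outfile:
    json.dump(serializable, outfile)

# Sanity check: read the file back and compare against the original tensors.
# float32 -> Python float -> float32 is an exact round trip, so values match.
with open("json_data.json") as infile:
    restored = json.load(infile)

for k in sd:
    assert torch.allclose(sd[k], torch.tensor(restored[k]))
```

From here, reshaping these lists into whatever layer layout the target JSON schema expects is plain Python dictionary manipulation.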

I would like to ask how I could send this model across websockets. I wrote similar code, but starting the training did not work. I am using FastAPI.
Client-side

async def connect(self):
    uri = f"ws://{self.server}/ws/{self.client}"
    async with websockets.connect(uri) as websocket:
        self.websocket = websocket
        await self.register_handles()
        while True:
            data = await websocket.recv()
            data = json.loads(data)
            print(data)
            if data["event"] == "start_training":
                await self.start_training(data["data"])

Server-side

async def start_round(self, websocket):
    global connected_nodes, pending_nodes
    print(f'Starting round {self.round + 1}')
    pending_nodes = connected_nodes.copy()
    model_state_dict = self.global_model.state_dict()
    model_json = OrderedDict({k: model_state_dict[k].detach().cpu().tolist() for k in model_state_dict})
    global_model_json = json.dumps(model_json)
    print(global_model_json)
    model_weights = encode_layer(model_state_dict)

    await websocket.send_json({
        "event": "start_training",
        "data": {
            "model_architecture": global_model_json,
            "model_weights": model_weights,
        }
    })
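On the receiving side you would reverse the conversion before calling load_state_dict. Here is a minimal sketch, independent of the FastAPI/websocket plumbing above; the nn.Linear stand-in is hypothetical, and the key point is that JSON carries only the weights, so the client must already have the same architecture defined locally.

```python
import json
from collections import OrderedDict

import torch
import torch.nn as nn

# Sender side: serialize the state_dict to JSON-safe nested lists (as above).
global_model = nn.Linear(4, 2)
payload = json.dumps(
    {k: v.detach().cpu().tolist() for k, v in global_model.state_dict().items()}
)

# Receiver side: rebuild tensors from the lists and load them into a
# locally constructed model with the identical architecture.
local_model = nn.Linear(4, 2)
restored = OrderedDict(
    {k: torch.tensor(v) for k, v in json.loads(payload).items()}
)
local_model.load_state_dict(restored)

# After loading, both models produce identical outputs.
x = torch.randn(1, 4)
assert torch.allclose(global_model(x), local_model(x))
```

For large models you may want a binary encoding instead of JSON text, but for small federated rounds like this the list-based approach keeps both sides simple.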

Good luck on your project.