TypeError - torch.device turning into RepeatedScalarContainer?

I’m using Unity’s ML-Agents and getting the error TypeError: new() received an invalid combination of arguments - got (google.protobuf.pyext._message.RepeatedScalarContainer, int)

Traceback (most recent call last):
  File "train.py", line 87, in <module>
    agent_1 = Agent(state_size=48, action_size=action_size, random_seed=0)
  File "C:\Users\Tester\ml-agents\ml-agents\mlagents\trainers\ddpg\ddpg_agent.py", line 39, in __init__
    self.actor_local = Actor(state_size, action_size, random_seed).to(device)
  File "C:\Users\Tester\ml-agents\ml-agents\mlagents\trainers\ddpg\model.py", line 29, in __init__
    self.fc3 = nn.Linear(fc2_units, action_size)
  File "C:\Users\Tester\AppData\Local\conda\conda\envs\ml-agents\lib\site-packages\torch\nn\modules\linear.py", line 51, in __init__
    self.weight = Parameter(torch.Tensor(out_features, in_features))
TypeError: new() received an invalid combination of arguments - got (google.protobuf.pyext._message.RepeatedScalarContainer, int), but expected one of:
 * (torch.device device)
 * (torch.Storage storage)
 * (Tensor other)
 * (tuple of ints size, torch.device device)
      didn't match because some of the arguments have invalid types: (google.protobuf.pyext._message.RepeatedScalarContainer, int)
 * (object data, torch.device device)
      didn't match because some of the arguments have invalid types: (google.protobuf.pyext._message.RepeatedScalarContainer, int)

I’ve tested the device variable that I’m passing through and confirmed that it’s of class torch.device. What is a RepeatedScalarContainer? Why could this be happening?

As shown in the error message:

TypeError: new() received an invalid combination of arguments - got (google.protobuf.pyext._message.RepeatedScalarContainer, int), but expected one of:

google.protobuf.pyext._message.RepeatedScalarContainer should be an int. It is a list-like container type from the protobuf API, so what it holds depends on the API you are using (here, ML-Agents' protobuf messages). The main problem is that you are not passing an int to the nn.Linear layer: following the traceback, nn.Linear(fc2_units, action_size) ends up calling torch.Tensor(out_features, in_features), and the first argument it receives is the RepeatedScalarContainer, which means action_size is a protobuf repeated field rather than a plain Python int.
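
If action_size is read straight from an ML-Agents brain/protobuf message, converting it to a plain int before constructing the agent should resolve the error. A minimal sketch, assuming the value comes from a one-element repeated field named vector_action_space_size (that attribute name is an assumption and may differ in your ML-Agents version):

    action_size = brain.vector_action_space_size   # protobuf repeated field, e.g. [4] (assumed source)
    print(type(action_size))                        # confirms it is a RepeatedScalarContainer, not an int
    action_size = int(action_size[0])               # unwrap the single entry into a plain Python int
    agent_1 = Agent(state_size=48, action_size=action_size, random_seed=0)  # nn.Linear now receives ints

The same type check is worth running on state_size, since every size forwarded to nn.Linear has to be a plain int, not a protobuf container or a list.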