Help with debugging - ValueError: optimizer got an empty parameter list

I have a beginner question about this NLHF code: I cannot yet spot where it goes wrong.

    No parameters in params.
    Traceback (most recent call last):
      File "/Users/john/Downloads/nlhf.py", line 567, in <module>
        optimizer_current_policy = AdamW_on_Lion_Optimizer(
      File "/Users/john/Downloads/nlhf.py", line 534, in __init__
        self.adamW = optim.AdamW(params=params, lr=lr, betas=adam_betas,
      File "/Users/john/.nlp/lib/python3.9/site-packages/torch/optim/adamw.py", line 52, in __init__
        super().__init__(params, defaults)
      File "/Users/john/.nlp/lib/python3.9/site-packages/torch/optim/optimizer.py", line 261, in __init__
        raise ValueError("optimizer got an empty parameter list")
    ValueError: optimizer got an empty parameter list

Note: Before params is passed into AdamW_on_Lion_Optimizer(), params is a valid non-empty parameter list.

How did you create params?

See lines 567 and 577 of the code.

Using line 577 does not trigger the error, but line 567 does.

@ptrblck if you add the following diagnostic code just before line 567 (`optimizer_current_policy = AdamW_on_Lion_Optimizer(params=current_policy.parameters(), lr=1e-3)`), you will notice that params is a valid, non-empty parameter list before it is passed into AdamW_on_Lion_Optimizer():

     # Diagnostic code to check parameters
     params_list = list(current_policy.parameters())
     if not params_list:
         print("No parameters in current_policy.")
     else:
         for idx, param in enumerate(params_list):
             print(f"Param {idx}: requires_grad={param.requires_grad}")
         if any(p.requires_grad for p in params_list):
             print("There are parameters that require gradients.")
         else:
             print("No parameters require gradients.")

params is not actually empty when it is first passed into the AdamW_on_Lion_Optimizer() `__init__` as current_policy.parameters(), so I am really confused about why it became empty once inside `__init__`.

@ptrblck: I wonder if you have any ideas, comments, or suggestions, or have experienced something similar before?
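For what it's worth, one common cause of this exact symptom is that `nn.Module.parameters()` returns a one-shot generator, not a list. If the custom optimizer's `__init__` iterates over `params` once (for example, in a diagnostic or logging loop) before handing it to `optim.AdamW`, the second consumer sees an empty sequence. A minimal sketch of the failure mode (this uses a plain `nn.Linear`, not the original `AdamW_on_Lion_Optimizer`, which I don't have the source for):

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(4, 2)
params = model.parameters()  # a one-shot generator, not a list

# Any full iteration consumes the generator ...
print(f"{len(list(params))} params seen")  # weight + bias

# ... so a second consumer now sees an empty parameter list
try:
    optim.AdamW(params, lr=1e-3)
except ValueError as e:
    print(e)  # optimizer got an empty parameter list

# Fix: materialize the parameters once, then reuse the list freely
params = list(model.parameters())
assert params  # inspection no longer exhausts it
optimizer = optim.AdamW(params, lr=1e-3)
```

If AdamW_on_Lion_Optimizer touches `params` before the `optim.AdamW(...)` call on line 534, converting to a list at the top of `__init__` (`params = list(params)`) would be worth trying.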