Need help with an Opacus optimizer error!

I am fine-tuning a GPT-like model with Opacus, but the optimizer step raises the following error:

Traceback (most recent call last):
  File "", line 50, in <module>
  File "", line 47, in main
  File "/code/src/", line 19, in finetune
  File "/opt/conda/lib/python3.7/site-packages/opacus/optimizers/", line 513, in step
    if self.pre_step():
  File "/opt/conda/lib/python3.7/site-packages/opacus/optimizers/", line 494, in pre_step
  File "/opt/conda/lib/python3.7/site-packages/opacus/optimizers/", line 404, in clip_and_accumulate
    per_sample_norms = torch.stack(per_param_norms, dim=1).norm(2, dim=1)
RuntimeError: stack expects each tensor to be equal size, but got [32] at entry 0 and [1] at entry 234
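From the traceback, it looks like the per-sample gradient norms for one parameter (entry 234) have batch dimension 1 while the others have 32. A torch-only sketch that reproduces the underlying stack error, using made-up shapes taken from the message:

```python
import torch

# Minimal torch-only reproduction of the underlying failure: Opacus builds a
# list with one per-sample-norm tensor per parameter and stacks them. If one
# parameter's tensor has batch dim 1 instead of 32, the stack call fails.
per_param_norms = [torch.ones(32), torch.ones(32), torch.ones(1)]

err_msg = ""
try:
    torch.stack(per_param_norms, dim=1).norm(2, dim=1)
except RuntimeError as e:
    err_msg = str(e)

print(err_msg)
```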

The same training code runs fine when I don't use Opacus. Could anyone help me figure out what is going wrong?
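For debugging on my side, this is the kind of check I plan to run right after loss.backward() to find which parameter has the wrong per-sample batch dimension. It is a torch-only sketch with a toy model; in the real run the grad_sample tensors come from Opacus, here I fake them to show the check:

```python
import torch
from torch import nn

# Toy model standing in for my GPT-like network; names/shapes are placeholders.
model = nn.Linear(4, 2)
batch_size = 32

# After loss.backward() under Opacus, every parameter carries a .grad_sample
# tensor whose first dim should equal the batch size. Here I fake those
# tensors, giving the second parameter a wrong batch dim of 1 to mimic
# the failure.
for i, p in enumerate(model.parameters()):
    p.grad_sample = torch.zeros(batch_size if i == 0 else 1, *p.shape)

# The check itself: list parameters whose grad_sample batch dim disagrees.
mismatched = [
    (i, tuple(p.grad_sample.shape))
    for i, p in enumerate(model.parameters())
    if p.grad_sample.shape[0] != batch_size
]
print(mismatched)
```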