StopIteration: Caught StopIteration in replica 0 on device 0

Original Traceback (most recent call last):
  File "/home/jjrv/miniconda3/envs/layoutlm/lib/python3.7/site-packages/torch/nn/parallel/parallel_apply.py", line 61, in _worker
    output = module(*input, **kwargs)
  File "/home/jjrv/miniconda3/envs/layoutlm/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data_nas/jjrv/P_LayoutLM-CW/model/modeling_bert.py", line 493, in forward
    head_mask=None,cx = cx ,cy = cy,height = height)
  File "/home/jjrv/miniconda3/envs/layoutlm/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data_nas/jjrv/P_LayoutLM-CW/model/modeling_bert.py", line 336, in forward
    extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype)  # fp16 compatibility
StopIteration

The call to next(self.parameters()) caused the error. I wonder if it's related to the PyTorch version?

I don't think the parameters() method has changed in the last year(s), so I wouldn't expect the error to be caused by an update to the PyTorch version.
Your post doesn't have enough details to properly debug the issue, so I would recommend checking the self object (I assume it should be a model) and verifying that any parameters are registered at all. An empty model raises StopIteration directly, as seen here:

import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        # no parameters or submodules are registered here

    def forward(self, x):
        return x

model = MyModel()
print(list(model.parameters()))
# []
next(model.parameters())
# StopIteration
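
Since your traceback shows the failure inside replica 0 of an nn.DataParallel run, it's also worth noting that in some PyTorch versions the replicated modules don't re-register their parameters, so parameters() can yield nothing inside a replica even when the original model has parameters. In either case, one defensive pattern is to pass a default to next() instead of letting StopIteration propagate. Here is a minimal sketch of that pattern; MaskedModel and the fallback to the input dtype are illustrative assumptions, not code from your modeling_bert.py:

import torch
import torch.nn as nn

class MaskedModel(nn.Module):
    # Hypothetical module; names are made up for this example.
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

    def forward(self, x, attention_mask):
        # next(iterator, default) returns the default instead of raising
        # StopIteration when the parameter iterator is empty.
        first_param = next(self.parameters(), None)
        dtype = first_param.dtype if first_param is not None else x.dtype
        extended_mask = attention_mask.to(dtype=dtype)  # fp16 compatibility
        return self.linear(x) * extended_mask

model = MaskedModel()
x = torch.randn(2, 4)
mask = torch.ones(2, 1)
out = model(x, mask)
print(out.shape)
# torch.Size([2, 4])

Alternatively, you could cache the dtype once in __init__ so the forward pass never touches the parameter iterator at all.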