Original Traceback (most recent call last):
  File "/home/jjrv/miniconda3/envs/layoutlm/lib/python3.7/site-packages/torch/nn/parallel/parallel_apply.py", line 61, in _worker
    output = module(*input, **kwargs)
  File "/home/jjrv/miniconda3/envs/layoutlm/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data_nas/jjrv/P_LayoutLM-CW/model/modeling_bert.py", line 493, in forward
    head_mask=None, cx=cx, cy=cy, height=height)
  File "/home/jjrv/miniconda3/envs/layoutlm/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data_nas/jjrv/P_LayoutLM-CW/model/modeling_bert.py", line 336, in forward
    extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype)  # fp16 compatibility
StopIteration
The call next(self.parameters()) is what raised the StopIteration. Could this be related to the PyTorch version?
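For reference, a minimal sketch of the suspected mechanism. This rests on an assumption (not confirmed by the traceback alone): from PyTorch 1.5 onward, nn.DataParallel replicas created in the parallel_apply workers no longer register parameters, so self.parameters() yields nothing and next() raises StopIteration. replica_parameters below is a hypothetical stand-in for self.parameters() on such a replica.

```python
def replica_parameters():
    # Stand-in for self.parameters() on a DataParallel replica,
    # which (on PyTorch >= 1.5) yields no parameters.
    return iter([])

# What modeling_bert.py line 336 effectively does inside the worker:
try:
    next(replica_parameters())
except StopIteration:
    print("StopIteration, as in the traceback")

# A defensive pattern: give next() a default so it never raises, then
# fall back to a fixed dtype (which dtype to use is an assumption).
first_param = next(replica_parameters(), None)
print("fallback needed:", first_param is None)
```

If this is the cause, running the model without DataParallel (or on a single GPU) should make the error disappear, which would be a quick way to confirm the PyTorch-version hypothesis.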