BERT multilingual training

I was trying to fine-tune a multilingual_bert_cased model on my custom dataset, but for certain datapoints a shape mismatch in the embeddings leads to an error. As per the multilingual config.json, max_position_embeddings is 512, so whenever the input embeddings and token type embeddings have a sequence length greater than 512, they cannot be added to the position embeddings. Is there any solution to include those datapoints in training, or does BERT simply have this limitation for sentences longer than the maximum sequence length it was pretrained with?
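For reference, I know I could just drop the extra tokens at tokenization time by truncating everything to 512, roughly like the sketch below (`texts` here is only a placeholder for my real datapoints), but that throws away everything past position 512, which is exactly what I'd like to avoid:

```python
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")

# Placeholder for my actual datapoints, some of which tokenize to > 512 tokens.
texts = ["some very long document ..."]

encodings = tokenizer(
    texts,
    truncation=True,       # cut anything beyond max_length instead of failing later
    max_length=512,        # matches max_position_embeddings in the config
    padding="max_length",
    return_tensors="pt",
)

print(encodings["input_ids"].shape)  # torch.Size([1, 512])
```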

Following is the error in the terminal:

torch.Size([2, 97, 768])
torch.Size([1, 97, 768])
torch.Size([2, 97, 768])
torch.Size([2, 617, 768])
torch.Size([1, 512, 768])
torch.Size([2, 617, 768])
Traceback (most recent call last):
  File "/users/sbhatta9/conditionedbert/bert_using_trainer.py", line 115, in <module>
    train(train_dataset)
  File "/users/sbhatta9/conditionedbert/bert_using_trainer.py", line 105, in train
    trainer.train()
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/transformers/trainer.py", line 707, in train
    tr_loss += self.training_step(model, inputs)
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/transformers/trainer.py", line 994, in training_step
    outputs = model(**inputs)
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/transformers/modeling_bert.py", line 1150, in forward
    return_dict=return_dict,
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/transformers/modeling_bert.py", line 828, in forward
    input_ids=input_ids, position_ids=position_ids, token_type_ids=token_type_ids, inputs_embeds=inputs_embeds
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/users/sbhatta9/sumanta/lib/python3.7/site-packages/transformers/modeling_bert.py", line 214, in forward
    embeddings = inputs_embeds + position_embeddings + token_type_embeddings
RuntimeError: The size of tensor a (617) must match the size of tensor b (512) at non-singleton dimension 1