Loading a model from pytorch_pretrained_bert into the transformers library

I have a pretrained BERT model for a classification task, trained with the pytorch_pretrained_bert library. I would like to use the weights from this model as the starting point for further training with the transformers library.

When I try to load this model, I get the runtime error below. Does anyone know what causes it?

import torch

model = BertClassification(weight_path=pretrained_weights_path, num_labels=num_labels)
state_dict = torch.load(fine_tuned_weight_path, map_location='cuda:0')
model.load_state_dict(state_dict)

RuntimeError: Error(s) in loading state_dict for BertClassification:
Missing key(s) in state_dict: "bert.embeddings.position_ids".
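
From what I understand, newer versions of transformers register position_ids as a buffer on BertEmbeddings, so a checkpoint saved with pytorch_pretrained_bert would never contain that key. Would loading non-strictly and then checking the reported keys, along the lines of the sketch below, be a reasonable way to handle this? (This reuses my model and fine_tuned_weight_path from above.)

# Load the old checkpoint while allowing missing keys, then verify that the
# only missing key really is the position_ids buffer and nothing else.
state_dict = torch.load(fine_tuned_weight_path, map_location='cuda:0')
result = model.load_state_dict(state_dict, strict=False)
print('missing keys:', result.missing_keys)        # hoping for only ['bert.embeddings.position_ids']
print('unexpected keys:', result.unexpected_keys)  # hoping for []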

Thanks very much.