Pretraining an existing BERT model with new features (continuous variables such as time & temperature)

I'm looking for suggestions on extending an existing BERT model that is pre-trained for sentence classification.

The existing model accepts text in the form of a string such as “ClC1=CC=CC(Cl)=C1C=O.ClC1=CC=CC(Cl)”, which represents a chemical reaction. I now want to enhance the model so that it can also take continuous values such as time and temperature as additional features, and then retrain it on my dataset.
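For context, this is roughly how I use the text-only model today (a minimal sketch; the checkpoint name is a placeholder for my actual pre-trained model):

```python
from transformers import AutoTokenizer, AutoModel

# "your-pretrained-chemistry-bert" is a placeholder checkpoint name
tokenizer = AutoTokenizer.from_pretrained("your-pretrained-chemistry-bert")
bert = AutoModel.from_pretrained("your-pretrained-chemistry-bert")

# the model only sees the reaction string, nothing else
inputs = tokenizer("ClC1=CC=CC(Cl)=C1C=O.ClC1=CC=CC(Cl)", return_tensors="pt")
outputs = bert(**inputs)
cls_embedding = outputs.last_hidden_state[:, 0]  # (1, hidden_size)
```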

My idea is an ensemble-style approach: take the last-layer output of the existing BERT model, build a second model that accepts only the continuous variables, and concatenate the two, similar to this thread: Combining Trained Models in PyTorch. A rough sketch of what I mean is below.
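This is only a sketch of the concatenation idea, not a finished implementation; the checkpoint name, the MLP width, the number of continuous features and the number of classes are all placeholders I would tune for my data:

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class BertWithContinuousFeatures(nn.Module):
    def __init__(self, checkpoint, num_cont_features=2, num_classes=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(checkpoint)
        hidden = self.bert.config.hidden_size
        # small MLP over the continuous features (time, temperature, ...)
        self.cont_mlp = nn.Sequential(
            nn.Linear(num_cont_features, 32),
            nn.ReLU(),
        )
        # classifier over the concatenated representation
        self.classifier = nn.Linear(hidden + 32, num_classes)

    def forward(self, input_ids, attention_mask, cont_features):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]          # (batch, hidden)
        cont = self.cont_mlp(cont_features)        # (batch, 32)
        combined = torch.cat([cls, cont], dim=-1)  # (batch, hidden + 32)
        return self.classifier(combined)
```

I would normalise the continuous features before feeding them in, and either fine-tune everything end-to-end or freeze the BERT part and only train the new layers.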

Are there any alternative approaches or suggestions, or links I could use as a resource for this implementation?

Thanks
Rahul Raj Devaraja