Sharing a single model instance across multiple Flask endpoints

I have a sample Flask app like this:

from flask import Flask

app = Flask(__name__)
model = pytorch_model_instance  # PyTorch model, loaded once at startup

@app.route("/task1")
def do_task1():
    pred = model.predict()
    return pred

@app.route("/task2")
def do_task2():
    pred = model.predict()
    return pred

These two endpoints could call the model simultaneously for prediction. My question is: if I define a single model instance shared by both endpoints, are there any problems when both of them access it at the same time for prediction?
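For context, one workaround I'm considering is serializing access with a `threading.Lock`. A minimal sketch of the idea, using a dummy model as a stand-in for the real PyTorch model (`DummyModel` and `model_lock` are hypothetical names, not from my actual app):

```python
import threading

class DummyModel:
    """Stand-in for the real PyTorch model (hypothetical)."""
    def predict(self):
        return "pred"

model = DummyModel()
model_lock = threading.Lock()  # serializes access to the shared model

def do_task(results, i):
    # Only one thread runs inference at a time while the lock is held
    with model_lock:
        results[i] = model.predict()

# Simulate several requests hitting the shared model concurrently
results = [None] * 4
threads = [threading.Thread(target=do_task, args=(results, i)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

I'm not sure whether this lock is actually necessary for inference-only access, or whether it just throws away concurrency for nothing.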