I'm trying to run a simple predict, but it keeps eating memory. I'm not sure whether it's a fastai or a torch issue, and I have no clue why.
Environment: Docker container, Ubuntu 18.04, Python 3.6
```python
from fastai.basic_train import load_learner

predictor = load_learner(path, "file.pkl")
pred_class, pred_idx, outputs = predictor.predict("string")
```
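To help narrow down whether the growth is on the torch side at all, here is a minimal plain-torch sketch (the model and shapes are made up, not from my fastai setup) that runs inference with autograd disabled. If repeated calls like this don't grow memory in the same container, the problem is more likely in the fastai/pickle layer than in torch itself:

```python
import torch

def predict_no_grad(model, x):
    # Inference-only forward pass: torch.no_grad() stops autograd
    # from recording the computation graph, so intermediate
    # activations are freed right away instead of being retained
    # across repeated predict calls.
    model.eval()
    with torch.no_grad():
        return model(x)

model = torch.nn.Linear(4, 2)          # toy stand-in model
out = predict_no_grad(model, torch.randn(1, 4))
print(out.requires_grad)               # False: no graph attached
```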
I can't reproduce the memory leak on my Mac or on my Linux machine, so I think it's a Docker-related problem, maybe a missing library. Any thoughts?
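For what it's worth, here is the kind of stdlib-only harness I'd run inside the container to confirm the per-call growth (the names `rss_mb`/`check_leak` are mine; `resource` is Unix-only, which is fine for this Ubuntu image). Swapping the lambda for `lambda: predictor.predict("string")` should show whether RSS climbs with each call:

```python
import gc
import resource

def rss_mb():
    # Peak resident set size in MB. On Linux ru_maxrss is in KB.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

def check_leak(fn, n=100):
    # Call fn repeatedly and report peak RSS before and after,
    # to see whether repeated calls actually grow the process.
    gc.collect()
    before = rss_mb()
    for _ in range(n):
        fn()
    gc.collect()
    return before, rss_mb()

before, after = check_leak(lambda: sum(range(1000)))
print(f"peak RSS: {before:.1f} MB -> {after:.1f} MB")
```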