I have trained a model with PyTorch in Python on a GPU. Now I want to deploy the model to a Spark environment for production, but I am not sure of the best approach. Some options I am considering:
- Maybe I should write Java/Scala code to load the parameters and make predictions
- Maybe I should rewrite the model in Keras and load it with Deeplearning4j
- Or is there an easier way?
You have two options. The first is to run the Databricks Runtime for ML, which ships with PyTorch preinstalled (see the "PyTorch" page in the Databricks on AWS documentation), and score directly inside Spark.
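For the Spark-native route, a common pattern is to broadcast the trained weights and score in batches with a pandas UDF on the executors. Here is a minimal sketch, assuming a model class `MyNet`, a checkpoint file `my_model.pt`, a 10-dimensional `array<float>` column named `features`, and the Parquet path, all of which are placeholders for your own setup:

```python
import numpy as np
import pandas as pd
import torch
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import DoubleType

class MyNet(torch.nn.Module):
    """Placeholder architecture -- replace with your trained model class."""
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(10, 1)  # assumes 10-dim feature vectors

    def forward(self, x):
        return self.fc(x)

spark = SparkSession.builder.getOrCreate()

# Load the trained weights once on the driver and broadcast them, so each
# executor can rebuild the model locally instead of re-reading from disk.
state_dict = torch.load("my_model.pt", map_location="cpu")
bc_state = spark.sparkContext.broadcast(state_dict)

@pandas_udf(DoubleType())
def predict_udf(features: pd.Series) -> pd.Series:
    # Rebuild the model on the executor from the broadcast weights.
    model = MyNet()
    model.load_state_dict(bc_state.value)
    model.eval()
    # Stack the per-row feature arrays into one batch tensor.
    x = torch.tensor(np.vstack(features.to_numpy()), dtype=torch.float32)
    with torch.no_grad():
        preds = model(x).squeeze(-1)
    return pd.Series(preds.numpy())

# Assumes a DataFrame with an array<float> "features" column.
df = spark.read.parquet("s3://bucket/features/")
scored = df.withColumn("prediction", predict_udf("features"))
```

Batching through a pandas UDF amortizes the model setup over many rows, which matters far more than per-row overhead for neural-network inference.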
Alternatively, you can deploy the Python model as a service with TorchServe (https://github.com/pytorch/serve) and call it from your Spark code.
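With this approach, Spark only needs an HTTP client. A minimal sketch, assuming the service is reachable at `torchserve-host:8080` with a registered model named `my_model` (both placeholders), and that your handler accepts a JSON body and returns a single number per request; TorchServe exposes its inference API at `POST /predictions/<model_name>`:

```python
import requests
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import DoubleType

spark = SparkSession.builder.getOrCreate()

# Assumed endpoint: adjust host, port, and model name to your deployment.
TORCHSERVE_URL = "http://torchserve-host:8080/predictions/my_model"

@udf(DoubleType())
def predict_via_service(features):
    # One HTTP call per row keeps the sketch simple; for production you
    # would batch rows (e.g. via mapPartitions with a shared Session).
    resp = requests.post(TORCHSERVE_URL, json={"data": list(features)})
    resp.raise_for_status()
    # The request/response shape depends on your TorchServe handler; here
    # we assume it returns a single numeric prediction as the body.
    return float(resp.text)

df = spark.read.parquet("s3://bucket/features/")
scored = df.withColumn("prediction", predict_via_service("features"))
```

The trade-off: the service approach keeps your PyTorch environment out of the Spark cluster entirely, at the cost of network round-trips, while the broadcast-and-UDF approach keeps everything inside Spark but requires PyTorch on every executor.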