I have trained a model with PyTorch on a GPU, in Python. Now I want to deploy the model to a Spark environment for production, and I wonder how to do that. Some options I'm considering:
- Maybe I should write Java/Scala code to load the parameters and run prediction on the JVM.
- Maybe I should rewrite the model in Keras and load it with Deeplearning4j.
- Is there an easier way?
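One lower-effort route worth considering: export the trained model to TorchScript with `torch.jit.trace` (or `torch.jit.script`), which produces a serialized file that can be loaded without the original Python class definition. Here is a minimal sketch; the tiny `nn.Linear` model and the file name `model.pt` are placeholders standing in for your trained network:

```python
import torch
import torch.nn as nn

# Placeholder model standing in for the trained network.
model = nn.Sequential(nn.Linear(4, 2))
model.eval()

# Trace the model into TorchScript using an example input of the right shape.
example_input = torch.randn(1, 4)
scripted = torch.jit.trace(model, example_input)
scripted.save("model.pt")

# The saved file can be loaded back without the original model code,
# e.g. once per Spark executor inside a UDF.
loaded = torch.jit.load("model.pt")
with torch.no_grad():
    output = loaded(example_input)
```

From there, in PySpark you could broadcast the path to `model.pt` and call `torch.jit.load` inside a pandas UDF, so each executor loads the model once and scores its partition in batches; TorchScript files can also be loaded from C++ via libtorch if you later need a JVM/native serving path instead of rewriting the model in Keras.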