Deploy pytorch model on spark

I have trained a model on GPU with PyTorch in Python. Now I want to deploy the model to a Spark environment for production, and I wonder how to do that.

  1. Maybe I should write Java/Scala code to load the parameters and make predictions
  2. Maybe I should rewrite the model in Keras and load it with Deeplearning4j
  3. Is there an easier way?

You have two options. Either you run the Databricks Runtime for ML, which ships with PyTorch preinstalled (see "PyTorch | Databricks on AWS" in the docs) and run inference directly on the cluster,
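With PyTorch available on the workers, the usual pattern is to load the model once per executor and score rows in batches with a pandas UDF. Here is a minimal sketch: `TinyNet` is a hypothetical stand-in for your architecture, and the checkpoint path is an assumption, so swap in your own model class and `state_dict`.

```python
import pandas as pd
import torch
import torch.nn as nn

# Hypothetical model class -- replace with your own architecture.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(3, 1)

    def forward(self, x):
        return self.fc(x)

# Load the model once per process, not once per row.
model = TinyNet()
# model.load_state_dict(torch.load("model.pt", map_location="cpu"))  # your checkpoint
model.eval()

def predict_batch(features: pd.Series) -> pd.Series:
    """Score a batch of rows; each element is a list of floats."""
    with torch.no_grad():
        x = torch.tensor(features.tolist(), dtype=torch.float32)
        return pd.Series(model(x).squeeze(1).numpy())

# On Spark, wrap it as a pandas UDF (requires pyspark and pyarrow):
# from pyspark.sql.functions import pandas_udf
# from pyspark.sql.types import DoubleType
# score = pandas_udf(predict_batch, DoubleType())
# df = df.withColumn("prediction", score("features"))
```

Batching through a pandas UDF amortizes the Python/JVM serialization cost, which matters far more than per-row overhead at production scale.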

Or you can deploy the Python model as a standalone service and call it from your Spark codebase.
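For the service route, the Spark side only needs an HTTP client. A sketch, assuming a hypothetical scoring endpoint that accepts `{"features": [...]}` and returns `{"prediction": <float>}` (the URL and JSON shape are assumptions, not a standard API):

```python
import json
from urllib import request

def score_row(url: str, features: list) -> float:
    """Call a scoring service over HTTP and return its prediction.

    Assumes the service accepts JSON {"features": [...]} and
    responds with JSON {"prediction": <float>}.
    """
    payload = json.dumps({"features": features}).encode("utf-8")
    req = request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req, timeout=5) as resp:
        return float(json.loads(resp.read())["prediction"])

# From Spark, wrap it as a plain UDF (one HTTP call per row; for
# throughput, batch requests inside mapInPandas instead):
# from pyspark.sql.functions import udf
# from pyspark.sql.types import DoubleType
# score = udf(lambda f: score_row("http://model-service:8000/predict", f),
#             DoubleType())
# df = df.withColumn("prediction", score("features"))
```

The trade-off versus the in-cluster option: the service keeps PyTorch off the Spark workers and lets you scale and version the model independently, at the cost of network latency per request.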