Utilising Serialized Torch Models in Apache Spark (Scala)

Hi, it’s Anmol Sinha here, a Senior Big Data Engineer.

For a few of my neural network projects, I built emotion recognition from text and transfer learning using BERT/Hugging Face (as well as some older models), and serialized them into .pt/.pth files by pickling. In PySpark, I can use the Python library “sparktorch” to apply these models to an Apache Spark DataFrame.

However, within my organization I work primarily as a Scala/Java developer. I would like to know how to deserialize the pickled *.pt models saved on my system and use them from Scala or Java, both of which run on the JVM. Can ONNX tools help?
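For context, this is roughly the JVM-side usage I am imagining, assuming the model were first exported to ONNX on the Python side (e.g. with torch.onnx.export) and the ai.onnxruntime Java library were on the classpath. The model path, input name ("input_ids"), and shape (1 x 128 token ids) are hypothetical placeholders, not something from my actual setup:

```scala
// Sketch only: assumes a BERT-style model already exported to model.onnx
// and the ai.onnxruntime:onnxruntime dependency available.
import ai.onnxruntime.{OnnxTensor, OrtEnvironment, OrtSession}
import scala.jdk.CollectionConverters._

object OnnxInferenceSketch {
  def main(args: Array[String]): Unit = {
    val env     = OrtEnvironment.getEnvironment()
    val session = env.createSession("model.onnx", new OrtSession.SessionOptions())

    // Hypothetical input: one sequence of 128 token ids; the real input
    // names and shapes depend on how the model was exported.
    val inputIds = Array.fill(1, 128)(0L)
    val tensor   = OnnxTensor.createTensor(env, inputIds)

    val result = session.run(Map("input_ids" -> tensor).asJava)
    // First output interpreted as a [batch, numClasses] float matrix.
    val logits = result.get(0).getValue.asInstanceOf[Array[Array[Float]]]
    println(logits.head.mkString(", "))

    result.close(); tensor.close(); session.close(); env.close()
  }
}
```

If something like this is viable, I would presumably wrap the session in a Spark UDF or mapPartitions call to score a DataFrame column, but I am unsure whether ONNX export preserves everything these BERT-based models need.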