Thanks for replying.
There are limitations without the source code, but I think a PyTorch model should be able to store the entire computation graph it needs by itself, just like TensorFlow, MXNet, and other frameworks do.
This is really needed for a general serving service. We are trying to implement a service that loads PyTorch models from users' model files alone, without their source files.
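For reference, a minimal sketch of the kind of workflow we have in mind, using TorchScript (`torch.jit`) so the graph is serialized into the model file and the serving side needs no Python source. The `SmallNet` module and the file name `model.pt` are just placeholders for illustration:

```python
import torch
import torch.nn as nn


# Model owner's side: a hypothetical example model.
class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))


model = SmallNet().eval()
example_input = torch.randn(1, 4)

# Trace (or script) the model so the computation graph is baked into the file.
traced = torch.jit.trace(model, example_input)
traced.save("model.pt")

# Serving side: load and run the model without access to SmallNet's source.
loaded = torch.jit.load("model.pt")
print(loaded(torch.randn(1, 4)))
```

With plain `torch.save`/`torch.load` of a state dict, the serving side would still need the class definition, which is exactly the limitation we want to avoid.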