TorchServe - Custom handler for time-series forecasting - "batch response mismatched" error

Hello

I would like to ask for help, advice, and hints about an issue I am facing when deploying a NN model for time-series predictions with a custom handler.
I finally got the model and inference working for time-series prediction. Everything works fine when I set the prediction/output size to 1.
But what I need is to predict more than one value. My model and handler produce this output, but I'm getting errors from TorchServe.

The response I get from curl is:
{
  "code": 503,
  "type": "InternalServerException",
  "message": "number of batch response mismatched"
}
In the log from TorchServe I see the following (I added some prints for debugging):
Postprocess out: [0.6994340419769287, 0.9235875606536865, 0.7007522583007812, 0.8973164558410645, 0.7195687294006348]
[INFO ] W-9003-RNN_HR_model_1.0-stdout MODEL_LOG - model: RNN_HR_model, number of batch response mismatched, expect: 1, got: 5.

So I get 5 values from inference.
In the documentation for handlers I see:

def inference(self, model_input):
    :return: list of inference output in NDArray

def postprocess(self, inference_output):
    :return: list of predict results

When I check my output I see:
MODEL_LOG - The type of model output is <class 'list'>
But even when I transform the inference output into an NDArray and the postprocess output into a list, as shown in the documentation, I still get the same response/error.

Any idea what I am doing wrong? I can't and don't want to use batch mode, since I have only one input and the output should be 5 forecasted values.
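For context, here is a minimal sketch (hypothetical function names, no TorchServe dependency) of what seems to be happening: TorchServe compares the length of the list returned by postprocess against the number of requests in the batch, so returning the 5 forecasts as 5 top-level list elements looks like 5 responses for 1 request.

```python
def postprocess_flat(inference_output):
    # Returns the 5 forecasts as 5 separate top-level list elements.
    return [float(v) for v in inference_output]

def batch_response_matches(responses, batch_size):
    # TorchServe performs an equivalent length check internally; a mismatch
    # produces the "number of batch response mismatched" error.
    return len(responses) == batch_size

forecasts = [0.6994, 0.9236, 0.7008, 0.8973, 0.7196]
print(batch_response_matches(postprocess_flat(forecasts), batch_size=1))
# False -> expect: 1, got: 5
```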

Thank you

You can wrap your output as a list of lists, so you return a list with a single element - just add an extra [].
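A minimal sketch of the fix (hypothetical function name): one request in the batch maps to one response element, which itself carries all 5 forecasted values.

```python
def postprocess_wrapped(inference_output):
    # Wrap the forecasts in an outer list: 1 request -> 1 response element,
    # and that single element contains all 5 forecasted values.
    return [[float(v) for v in inference_output]]

forecasts = [0.6994, 0.9236, 0.7008, 0.8973, 0.7196]
out = postprocess_wrapped(forecasts)
print(len(out))     # 1 response for 1 request
print(len(out[0]))  # 5 forecasted values inside it
```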

Another user faced a similar issue here Return a list of dicts from prediction · Issue #2169 · pytorch/serve · GitHub

Great, it works!!!

Thank you very much for your help.
Thanks a lot - you really helped me.
