How to export a model without a forward() function

As the title says, if I only have the "bone" (skeleton) of a neural network without any concrete forward() implementation, like:

def forward(self, x):
    raise NotImplementedError

Is it possible to export this model to ONNX or even TensorRT somehow?

Without the forward implementation, PyTorch wouldn't know how the initialized modules and parameters should be used in the forward pass, and I would assume the same applies to ONNX and TensorRT.
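In other words, once a real forward() is defined, the usual export path works. A minimal sketch, assuming a made-up SimpleNet module, layer sizes, and input shape just for illustration:

import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 4)

    def forward(self, x):
        # The forward pass tells PyTorch (and the ONNX exporter)
        # how the registered modules/parameters are actually used.
        return self.fc(x)

model = SimpleNet().eval()
dummy_input = torch.randn(1, 16)
torch.onnx.export(model, dummy_input, "simple_net.onnx")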

Sorry for the late reply, sir.
So can I just move the code from the inference() function into forward() to make the model recognizable to PyTorch? For example, something like the sketch below.
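A minimal sketch of what I mean, assuming inference() already takes a tensor and returns a tensor (that part is my assumption):

class MyNet(nn.Module):
    def forward(self, x):
        # Reuse the existing inference logic so PyTorch's export
        # machinery has a real forward pass to trace.
        return self.inference(x)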

It seems PyTorch recognizes my model as expected, but another problem happened:

torch._dynamo.exc.UserError: Tried to use data-dependent value in the subsequent computation. 
This can happen when we encounter unbounded dynamic value that is unknown during tracing time.   
You will need to explicitly give hint to the compiler.
Please take a look at constrain_as_value OR constrain_as_size APIs.  
It appears that you're trying to get a value out of symbolic int/float whose value is data-dependent (and thus we do not know the true value.) 
The expression we were trying to evaluate is i5 < 0 (unhinted: i5 < 0). 
Scroll up to see where each of these data-dependent accesses originally occurred.
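This error usually comes from a data-dependent scalar (for example an .item() call, or a shape derived from tensor contents) being used in slicing or Python control flow. The message points at the constrain_as_value / constrain_as_size APIs; on recent PyTorch versions torch._check / torch._check_is_size play the same role. A minimal sketch of the idea, with a made-up module and inputs for illustration:

import torch

class CropToLength(torch.nn.Module):
    def forward(self, x, lengths):
        # .item() yields a data-dependent integer that dynamo/export
        # cannot bound on its own, which triggers the UserError above.
        n = int(lengths.max().item())
        # Hint to the compiler that n is a valid, non-negative size,
        # so expressions like "n < 0" become decidable during tracing.
        torch._check_is_size(n)
        return x[:n]

Whether this hint alone is enough depends on your PyTorch version and on how the value is used afterwards, but it is the kind of explicit constraint the error message is asking for.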