Size of exported onnx model

I trained a model (model A) in PyTorch whose forward method depends on the outputs of another model (model B). Model B can be thought of as a feature extractor for model A.
After training, I saved the state_dict of model A only. As expected, the saved .pth file was small, since it contains just the trained parameters of model A.
However, when I export model A to ONNX format using torch.onnx.export, the resulting ONNX file is much larger and includes the parameters of model B as well. Is this the expected behavior? I understand that torch.onnx.export traces the forward method of model A (which involves calling model B) during export. But does the exported file necessarily have to include the parameters of model B along with those of model A?

Yes, this is expected. An ONNX file has to run standalone, so it must contain both the graph definition and all weights needed to compute the forward pass. Since model A's forward calls model B, tracing follows that call and bakes model B's parameters into the exported graph, so it makes sense that they are included.