Does the "script" method in the script module work for a nested model (e.g. a ResNet)?

Hi, I’m new to ScriptModule. I’m confused: can the “script” method return a ScriptModule for a nested model? For example, I have the following ResNet defined in Python, where InBlock, ResBlock, and OutBlock are standard CNN blocks with several conv2d layers.

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()  # required so the submodules below are registered
        self.inblock = InBlock()
        self.res_block_1 = ResBlock()
        self.res_block_2 = ResBlock()
        self.res_block_3 = ResBlock()
        self.res_block_4 = ResBlock()
        self.res_block_5 = ResBlock()
        self.res_block_6 = ResBlock()
        self.res_block_7 = ResBlock()
        self.res_block_8 = ResBlock()
        self.outblock = OutBlock()
    def forward(self,s):
        s = self.inblock(s)
        s = self.res_block_1(s)
        s = self.res_block_2(s)
        s = self.res_block_3(s)
        s = self.res_block_4(s)
        s = self.res_block_5(s)
        s = self.res_block_6(s)
        s = self.res_block_7(s)
        s = self.res_block_8(s)
        s = self.outblock(s)
        return s
net = Net()
sm = torch.jit.script(net)

Questions: 1) Does the “script” method work in this situation? 2) If I want to load and train the model with the following C++ code, are ALL the parameters of the model loaded, or only a subset, because the “script” method does not work recursively in libtorch?

auto module = torch::jit::load("");
        std::vector<torch::jit::IValue> inputs;
        std::vector<at::Tensor> parameters;
        for (const auto& params : module.parameters()) {
        torch::optim::Adam optimizer(parameters, 0.1);

Thanks a lot.

torch.jit.script scripts all properly registered submodules recursively (which is why the super().__init__() call matters), so your libtorch export should also contain all modules and parameters.
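You can verify this on the Python side before exporting by comparing the parameter counts of the eager and scripted modules. A minimal sketch, using hypothetical stand-in blocks since the real InBlock/ResBlock/OutBlock definitions aren't shown:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real blocks (assumed conv stacks).
class InBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
    def forward(self, x):
        return torch.relu(self.conv(x))

class ResBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(16, 16, 3, padding=1)
    def forward(self, x):
        return x + torch.relu(self.conv(x))

class OutBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(16, 3, 3, padding=1)
    def forward(self, x):
        return self.conv(x)

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.inblock = InBlock()
        self.blocks = nn.Sequential(*[ResBlock() for _ in range(8)])
        self.outblock = OutBlock()
    def forward(self, s):
        return self.outblock(self.blocks(self.inblock(s)))

net = Net()
sm = torch.jit.script(net)

# Scripting recurses into every registered submodule, so the scripted
# module exposes the same parameter set as the eager one.
n_eager = sum(p.numel() for p in net.parameters())
n_scripted = sum(p.numel() for p in sm.parameters())
print(n_eager == n_scripted)
```

If the counts match, the .pt file saved with sm.save(...) carries all of those parameters, and module.parameters() on the C++ side will yield the full set.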