Lazy initialization of layers and parameters: "Class Linear does not have an __init__ function defined"

My high-level goal is to compactly serialize model architectures without their parameter values (see the topic "Save model architecture only").

To accomplish this, I tried lazily initializing the layers and their parameters inside the forward() method:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net4(nn.Module):

    def forward(self, x):
        # Create each layer on first use instead of in __init__
        if not hasattr(self, 'fc1'):
            self.fc1 = nn.Linear(16 * 5 * 5, 120)
        if not hasattr(self, 'fc2'):
            self.fc2 = nn.Linear(120, 84)
        if not hasattr(self, 'fc3'):
            self.fc3 = nn.Linear(84, 10)

        x = F.sigmoid(self.fc1(x))
        x = F.sigmoid(self.fc2(x))
        x = self.fc3(x)
        return x

script_module4 = torch.jit.script(Net4())
script_module4.save('pytorch_model.Net4.TorchScriptModule')

I’m getting the following error:

/opt/conda/lib/python3.7/site-packages/torch/jit/_recursive.py in create_methods_and_properties_from_stubs(concrete_type, method_stubs, property_stubs)
    302     property_rcbs = [p.resolution_callback for p in property_stubs]
    303 
--> 304     concrete_type._create_methods_and_properties(property_defs, property_rcbs, method_defs, method_rcbs, method_defaults)
    305 
    306 

RuntimeError: 
Class Linear does not have an __init__ function defined:
  File "<ipython-input-23-bda237bd2209>", line 8
    def forward(self, x):
        if not hasattr(self, 'fc1'):
            self.fc1 = nn.Linear(16 * 5 * 5, 120)
                       ~~~~~~~~~ <--- HERE
        if not hasattr(self, 'fc2'):
            self.fc2 = nn.Linear(120, 84)

How can I work around this error?
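For comparison, here is a sketch of the same network with conventional eager initialization (Net4Eager is just a name I'm using here). This version scripts without the error, since every submodule exists before torch.jit.script runs, but its .save() output includes the parameter values I was hoping to avoid:

```python
import torch
import torch.nn as nn

class Net4Eager(nn.Module):
    def __init__(self):
        super().__init__()
        # Same layer sizes as Net4 above, created eagerly
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = torch.sigmoid(self.fc1(x))
        x = torch.sigmoid(self.fc2(x))
        return self.fc3(x)

# Scripting succeeds because all submodules are statically known
script_module = torch.jit.script(Net4Eager())
```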