How should I understand the '*' in this transfer learning model?

import torch.nn as nn
import torchvision

trained_model = torchvision.models.resnet18(pretrained=True)
model = nn.Sequential(*list(trained_model.children())[:-1],  # all children except the final fc layer
                      nn.Flatten(),
                      nn.Linear(512, 5))

I think I understand how the transfer learning works here:
in the new model,
*list(trained_model.children())[:-1] must (I guess) act as a fixed feature extractor without introducing any new parameters, so when you train model, only the parameters introduced by nn.Linear(512, 5) get updated.
Is that how the transfer learning is done here, am I right?

What I don't understand is this *: what does it do?
I can see that list(trained_model.children())[:-1] already gives me the first 9 children of trained_model (everything except the final fc layer), but isn't that already enough? Why add a * here?
Thanks.

1 - As written, training updates all parameters in the network, not just the new head. If you only want to train the final Linear(512, 5) layer, you have to freeze every layer before it.
2 - The * operator unpacks the list so each layer is passed as a separate argument, because nn.Sequential expects individual nn.Module arguments from resnet, not a single list of modules.
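The unpacking itself can be seen without torch at all. In this minimal sketch, seq is a hypothetical stand-in for nn.Sequential's *args-style signature:

```python
def seq(*modules):
    # mimics nn.Sequential: each module arrives as a separate positional argument
    return modules

layers = ["conv", "pool", "fc"]   # hypothetical placeholder "layers"

print(seq(*layers))  # ('conv', 'pool', 'fc')  -- three separate arguments
print(seq(layers))   # (['conv', 'pool', 'fc'],)  -- one argument: the whole list
```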

Thanks, willdone!

1: Really? Then how do I freeze those layers?
2: Is there a way to build the model from those layers without using * and list()?

Thanks.

1 - Iterate over the named parameters and set requires_grad = False on the ones you want frozen:

for n, p in model.named_parameters():
    if <your freezing condition on the name n>:
        p.requires_grad = False
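A concrete sketch of that pattern, using a small hypothetical Sequential in place of the real model (the name prefix "2." is just the index of the last layer in this toy example):

```python
import torch.nn as nn

# hypothetical stand-in for the transfer-learning model: trunk + new head
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 5))

# freeze everything except the final Linear (the new head)
for n, p in model.named_parameters():
    if not n.startswith("2."):       # "2." = index of the last layer in this Sequential
        p.requires_grad = False

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['2.weight', '2.bias']
```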

2 - You can add the layers to an nn.ModuleList instead, but then you have to loop over them yourself in forward(). As far as I know, nn.Sequential runs faster, and *list(...) is the easy way to handle it.
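A small sketch of the two options, using hypothetical tiny layers in place of the resnet trunk; both produce identical outputs here because they wrap the very same module objects:

```python
import torch
import torch.nn as nn

# hypothetical tiny stand-in for the resnet18 trunk: a plain list of modules
layers = [nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 5)]

# Option A: nn.Sequential with * unpacking -- forward() is automatic
seq_model = nn.Sequential(*layers)

# Option B: nn.ModuleList -- you must loop manually in forward()
class ListModel(nn.Module):
    def __init__(self, mods):
        super().__init__()
        self.mods = nn.ModuleList(mods)

    def forward(self, x):
        for m in self.mods:
            x = m(x)
        return x

list_model = ListModel(layers)
x = torch.randn(2, 4)
# same modules, same order, so the outputs match
assert torch.allclose(seq_model(x), list_model(x))
```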

Thanks, willdone1337!