Dear experienced ones:
I wonder how to remove the intermediate layers of a model. I know about the common approach of wrapping the model's children in an nn.Sequential container.
The problem is that my model has pooling layers and ReLU layers which are applied in the
forward() method of the model, so those operations would be missed by that approach.
The cleanest approach would be to derive a custom model class from the original model class as its base and override the
forward method with your new definition.
I would not recommend trying to wrap the submodules into an
nn.Sequential container for models that are more complicated than a few sequential layers.
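A minimal sketch of that approach, assuming a hypothetical model whose ReLU and pooling are applied functionally inside forward(): the derived class reuses the parent's trained submodules and only redefines forward() to stop before the final classifier.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical original model: conv -> relu -> pool -> fc,
# with ReLU and pooling applied functionally in forward().
class Original(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.fc = nn.Linear(8 * 16 * 16, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv(x)), 2)
        return self.fc(x.flatten(1))

# Derived class: same submodules (and same state_dict layout),
# but forward() skips the final fc layer.
class Truncated(Original):
    def forward(self, x):
        return F.max_pool2d(F.relu(self.conv(x)), 2)

model = Truncated()
out = model(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 8, 16, 16])
```

Because Truncated inherits from Original, you can load a checkpoint trained on the original model directly via load_state_dict.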
Thanks for the hints.
I need to make the process automated and general, no matter what the model is.
Is there any way to figure out if the model uses softmax in the
forward() method? It is still not clear how to remove that layer while keeping the operators in
forward() exactly the same as in the original model.
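One way to detect functional calls automatically (not something the thread has mentioned yet, so treat this as a suggestion) is to symbolically trace the model with torch.fx: the traced graph records every module call and every functional call, so you can search it for F.softmax.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.fx import symbolic_trace

# Hypothetical model that applies softmax functionally in forward().
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 3)

    def forward(self, x):
        return F.softmax(self.fc(x), dim=1)

traced = symbolic_trace(Net())
# Each graph node records its op kind ("call_module",
# "call_function", ...) and its target (the callable itself).
uses_softmax = any(
    node.op == "call_function" and node.target is F.softmax
    for node in traced.graph.nodes
)
print(uses_softmax)  # True
```

Note that torch.fx only works for models whose control flow is traceable (no data-dependent branching), so this is not fully general for arbitrary models.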
You could “remove”
nn.Modules by replacing them with nn.Identity modules.
However, this won’t work for functional calls in your model, since (as you already explained) these methods are not registered as modules, so you would need to override the forward method as described above.
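A short sketch of the nn.Identity replacement, using a toy nn.Sequential stack for illustration: the unwanted layer is swapped for a no-op, so the surrounding layers, parameter names, and checkpoint compatibility all stay intact.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 4),
    nn.ReLU(),
    nn.Linear(4, 2),
)

# "Remove" the ReLU by replacing it with a no-op nn.Identity;
# nn.Identity simply returns its input unchanged.
model[1] = nn.Identity()

out = model(torch.randn(1, 4))
print(out.shape)  # torch.Size([1, 2])
```

For a named submodule instead of an indexed one, the same idea is `model.relu = nn.Identity()` (or `setattr(model, name, nn.Identity())` when automating it).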