It might be related to wrapping the model in an nn.Sequential module. nn.Sequential is meant for fairly simple, purely sequential models, so I assume the one you are currently using might not work out of the box with this approach.
If you want to access some intermediate activations, I would recommend using forward hooks as described here and processing these activations further.
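As a rough sketch of the hook approach (the model and layer index here are just placeholders, not your actual setup):

```python
import torch
import torch.nn as nn

# Placeholder nn.Sequential model; swap in your own.
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 2),
)

activations = {}

def get_activation(name):
    # Returns a hook that stores the module's output under `name`.
    def hook(module, inp, out):
        activations[name] = out.detach()
    return hook

# Register the hook on the layer whose activation you need,
# e.g. the first Linear layer at index 0.
model[0].register_forward_hook(get_activation("linear0"))

x = torch.randn(1, 10)
out = model(x)
print(activations["linear0"].shape)  # torch.Size([1, 20])
```

After the forward pass you can process the stored tensors in `activations` however you like.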