I implemented a custom Module that consists of several convolution layers. However, I couldn't apply weight norm to it, because `weight_norm` uses `getattr(module, 'weight')` to get the weight variable. Is there any way to apply `weight_norm` directly to a module that has multiple sub-modules or weights?
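
For context, here is a minimal sketch of the kind of module I mean (the names `MyModule`, `conv1`, `conv2` are just placeholders), along with the workaround I'm currently considering: calling `weight_norm` on each convolution sub-module individually instead of on the parent module. Is this the intended way, or is there something more direct?

```python
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm

class MyModule(nn.Module):
    """Placeholder module with several convolutions."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv2(torch.relu(self.conv1(x)))

model = MyModule()

# weight_norm(model) fails because MyModule itself has no 'weight' attribute.
# Workaround sketch: apply weight_norm to each sub-module that owns a weight.
for submodule in model.modules():
    if isinstance(submodule, nn.Conv2d):
        weight_norm(submodule, name='weight')
```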