I can initialize some layers in my network using their built-in weight/bias initializations, for example:
import torch.nn as nn

def weight_init(m):
    '''
    Usage:
        model = Model()
        model.apply(weight_init)
    '''
    if isinstance(m, nn.Conv1d):
        m.reset_parameters()
    elif isinstance(m, nn.Linear):
        m.reset_parameters()
However, I also use ReLU/LeakyReLU activation layers and InstanceNorm1d/AvgPool1d layers, and these layers have no reset_parameters() function. Do I not need to reset these layers? My InstanceNorm1d layer has no learnable parameters.
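For context, here is a minimal sketch of the workaround I am currently considering (the layer shapes are just placeholders): I only call reset_parameters() when a module actually defines it, and then print how many learnable parameters each module holds, so the parameter-free layers (activations, AvgPool1d, non-affine InstanceNorm1d) are simply skipped.

import torch.nn as nn

def weight_init(m):
    # Reset only modules that actually define reset_parameters();
    # ReLU/LeakyReLU, AvgPool1d and a non-affine InstanceNorm1d
    # have nothing to reset, so they are skipped.
    if hasattr(m, 'reset_parameters'):
        m.reset_parameters()

# Placeholder model just to illustrate the layer types in question.
model = nn.Sequential(
    nn.Conv1d(8, 16, kernel_size=3),
    nn.LeakyReLU(),
    nn.InstanceNorm1d(16),   # affine=False by default -> no learnable parameters
    nn.AvgPool1d(2),
    nn.Linear(16, 4),
)
model.apply(weight_init)

# Quick check: which modules actually hold learnable parameters?
for name, module in model.named_modules():
    n_params = sum(p.numel() for p in module.parameters(recurse=False))
    print(name, type(module).__name__, n_params)

Is this hasattr-based approach reasonable, or is there a recommended way to handle such layers?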