Resetting parameters of Pooling, Activation and Normalization layers

I can initialize some layers in my network using in-built weight/bias initializations such as:

import torch.nn as nn

def weight_init(m):
    '''
    Re-initialize Conv1d and Linear layers with their built-in initializations.

    Usage:
        model = Model()
        model.apply(weight_init)
    '''
    if isinstance(m, (nn.Conv1d, nn.Linear)):
        m.reset_parameters()

However, I also use ReLU/LeakyReLU activation layers and InstanceNorm1d/AvgPool1d layers, and these layers have no reset_parameters() function. Do I not need to reset these layers? My InstanceNorm1d layers have no learnable parameters.

Regards

Ditlev

No, you don’t need to reset or initialize activation functions like ReLU, since they do not contain any parameters which could be initialized. The same holds for AvgPool1d and for InstanceNorm1d with affine=False (the default), as these layers also have no learnable parameters.
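
If you want to double-check this, here is a small sketch (assuming the default constructor arguments) that confirms these modules expose no parameters at all:

import torch.nn as nn

# None of these modules hold learnable parameters with default arguments,
# so there is nothing for reset_parameters() to reinitialize.
for module in [nn.ReLU(), nn.LeakyReLU(), nn.AvgPool1d(kernel_size=2), nn.InstanceNorm1d(16)]:
    params = list(module.parameters())
    print(type(module).__name__, len(params))  # prints 0 for each layer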
