How do I access my Conv/Linear weights without touching other learnable parameters

(I am relatively new to PyTorch so sorry if it is too basic)
I want to access my convolutional and linear weights for regularization. However, my model also has some PReLU parameters, and if I apply my regularization to them it will negatively affect my results.

Here is a toy model example:

#The Model
class TestModel(nn.Module):
    def __init__(self):
        super(TestModel, self).__init__()
        self.input_conv = nn.Conv2d(3, 64, 5, padding=2)
        self.i_activation = nn.PReLU(64)

        self.downconv1 = nn.Conv2d(64, 64, 3, padding=1)
        self.d_activation1 = nn.PReLU(64)

    def forward(self, x):
        x = self.input_conv(x)
        x = self.i_activation(x)

        x = self.downconv1(x)
        x = self.d_activation1(x)
        return x

#Instance of the model
model = TestModel()

Iterating with model.parameters() does not work, since the PReLU parameters stand in the way. Is there a kind of loop/function I can use to get around this? Thanks in advance.

You could use model.named_parameters(), which returns the name as well as the parameter, and then apply a condition on the name. Alternatively, you could directly access the desired weight parameter via model.input_conv.weight.
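To make this concrete, here is a minimal sketch based on the toy model above. It shows both the name-based filter and an alternative that filters by module type via model.modules(), which avoids depending on how the attributes happen to be named (the "activation" substring check is just an assumption based on the attribute names in this particular model):

```python
import torch
import torch.nn as nn

# Same toy model as in the question
class TestModel(nn.Module):
    def __init__(self):
        super(TestModel, self).__init__()
        self.input_conv = nn.Conv2d(3, 64, 5, padding=2)
        self.i_activation = nn.PReLU(64)
        self.downconv1 = nn.Conv2d(64, 64, 3, padding=1)
        self.d_activation1 = nn.PReLU(64)

model = TestModel()

# Option 1: filter on the parameter name.
# Note that PReLU parameters are also called "weight"
# (e.g. "i_activation.weight"), so checking name.endswith("weight")
# alone is not enough -- we also exclude the activation modules here.
named_weights = [
    p for name, p in model.named_parameters()
    if name.endswith(".weight") and "activation" not in name
]

# Option 2: filter by module type, which is more robust to naming.
type_weights = [
    m.weight for m in model.modules()
    if isinstance(m, (nn.Conv2d, nn.Linear))
]

# Either list can then be used for e.g. an L2 penalty:
l2_penalty = sum(w.pow(2).sum() for w in type_weights)
```

The l2_penalty term could then be scaled and added to your loss before calling backward(), leaving the PReLU parameters unregularized.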