How to check for the existence of parameters?

Hi all,

As part of manually updating parameters, I am trying to check for the existence of a parameter.

I have code that looks like this:

try:
    with torch.no_grad():
        model.linear_relu_stack[_i].weight = nn.Parameter(reshaped_params[_j])
except AttributeError:
    print("Weight doesn't exist, going to next")

Is there a more elegant way to check whether or not a layer has parameters at all?

Python's hasattr or getattr with a default should work.
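For example, something along these lines; this is just a minimal sketch with a toy nn.Sequential standing in for your linear_relu_stack, so adapt the names to your model:

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

for i, layer in enumerate(model):
    # hasattr is True only for layers that actually define a weight attribute
    if hasattr(layer, "weight"):
        with torch.no_grad():
            layer.weight = nn.Parameter(torch.zeros_like(layer.weight))
    else:
        print(f"Layer {i} has no weight, skipping")

    # Alternatively, getattr with a None default avoids the try/except:
    w = getattr(layer, "weight", None)
    if w is not None:
        print(f"Layer {i} weight shape: {tuple(w.shape)}")

Either variant lets you skip layers like ReLU that have no weight, without relying on catching an AttributeError.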

Best regards

Thomas