Hi all,
As part of manually updating parameters, I am trying to check for existence of a parameter.
I have code that looks like this:
try:
    with torch.no_grad():
        model.linear_relu_stack[_i].weight = nn.Parameter(reshaped_params[_j])
except AttributeError:
    print("Weight doesn't exist, going to next")
Is there a more elegant way to check whether a layer has parameters at all?
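For context, here is a minimal sketch of the kind of check I mean, using a toy nn.Sequential stack (the model and layer sizes are just placeholders, not my actual network). It tests whether a layer has any parameters by pulling the first item from its parameters() generator:

```python
import torch.nn as nn

def has_params(layer: nn.Module) -> bool:
    # parameters() is a generator; next() with a default of None
    # tells us whether the layer yields any parameters at all.
    return next(layer.parameters(), None) is not None

# Toy model for illustration only.
model = nn.Sequential(
    nn.Flatten(),      # no parameters
    nn.Linear(4, 3),   # weight and bias
    nn.ReLU(),         # no parameters
)

for i, layer in enumerate(model):
    print(i, type(layer).__name__, has_params(layer))
```

This avoids the try/except around the assignment, but I'm not sure it's the idiomatic way to do it.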