How to initialize nn.Parameter in an applied function?

I have the following function I'm using to initialize weights for testing purposes. (I need a deterministic function that produces something non-trivial, i.e., not all 0s or 1s.)

def init_weights(m):
    # Avoid random initializations
    if isinstance(m, (nn.Linear, nn.Conv3d, nn.LayerNorm)):
        # Chose these b/c they're deterministic
        print("initializing other layers")
    elif isinstance(m, nn.Parameter):
        print("initializing parameters")
    else:
        raise Exception("unknown layer")

The problem is that this function never seems to reach parameters (i.e., nn.Parameter). I can tell b/c "initializing parameters" is never printed. Is there a good solution here?
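For reference, Module.apply only walks Module instances (each submodule and the module itself); nn.Parameter is a Tensor subclass, not a Module, so the elif branch can never fire. A minimal sketch with a toy one-layer model shows which types apply actually visits:

```python
import torch.nn as nn

# Toy model: apply() visits the Linear child and the Sequential itself,
# but never any nn.Parameter
model = nn.Sequential(nn.Linear(4, 2))

visited = []
model.apply(lambda m: visited.append(type(m).__name__))

print(visited)  # only module class names, no 'Parameter' entries
```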

I suppose the correct follow-up question is: What function can I use to deterministically initialize weights non-trivially?

My answer to the 2nd question (which also indirectly solves my 1st question):

def deterministic_fill_(p):
    # A fresh Generator always starts from the same default seed,
    # so the same values come out on every run
    gen = torch.Generator()

    weights = torch.rand(p.shape, generator=gen)
    with torch.no_grad():
        p.copy_(weights)

def deterministic_init(m):
    params = 0
    for n, p in m.named_parameters():
        deterministic_fill_(p)
        params += 1
    print(f"{params} params deterministically initialized")
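A quick sanity check of the approach (repeating the definitions so the snippet is self-contained): two independently constructed models end up with identical weights, since every fill draws from a freshly created Generator in its default state.

```python
import torch
import torch.nn as nn

def deterministic_fill_(p):
    # a fresh Generator always starts from the same default seed
    gen = torch.Generator()
    with torch.no_grad():
        p.copy_(torch.rand(p.shape, generator=gen))

def deterministic_init(m):
    params = 0
    for n, p in m.named_parameters():
        deterministic_fill_(p)
        params += 1
    print(f"{params} params deterministically initialized")

# Two separately built toy models (hypothetical shapes, just for the check)
m1 = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))
m2 = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))
deterministic_init(m1)
deterministic_init(m2)

same = all(torch.equal(p1, p2)
           for p1, p2 in zip(m1.parameters(), m2.parameters()))
print(same)
```

Note that deterministic_init is called on the model directly rather than through model.apply, since named_parameters already recurses through all submodules.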