For some reason, I would like to use an upper-triangular tensor as model parameters. I used self.L = nn.Parameter(torch.rand((d, d)).triu()) to create the parameter during initialization, but this is not right: the total number of parameters is still d*d. What I want is for the lower-triangular part to stay fixed at 0 all the time. I think I have to unregister those entries, but I didn't find a clean way to do so. I'd appreciate it if anyone can help me.

An entire tensor is registered as a single Parameter. You can't unregister
just some elements of a tensor.

Your best bet will be to “freeze” the lower triangle at zero by zeroing out
that part of the tensor’s gradient. Here’s a post from @ptrblck that shows
how to do this:
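
As a minimal sketch of that idea (the module name and forward pass here are just illustrative assumptions, not from your code): initialize the parameter upper-triangular, then register a gradient hook that zeroes the lower-triangular part of the gradient. Since those entries start at zero and never receive a nonzero gradient, they stay at zero under plain gradient-based updates (note that optimizer features like weight decay or momentum applied outside the gradient would need separate care).

```python
import torch
import torch.nn as nn

class UpperTriLinear(nn.Module):
    """Toy module whose d x d parameter stays upper-triangular."""

    def __init__(self, d):
        super().__init__()
        # Start upper-triangular; the strictly lower part is exactly 0.
        self.L = nn.Parameter(torch.rand(d, d).triu())
        # Zero the lower-triangular part of every incoming gradient,
        # so those entries are never updated away from 0.
        self.L.register_hook(lambda grad: grad.triu())

    def forward(self, x):
        return x @ self.L
```

Note this freezes the entries rather than removing them: the tensor still stores d*d numbers, but only the d*(d+1)/2 upper-triangular ones are ever trained.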