(Suggestion/Question) Dynamic weight networks

Currently, there seems to be no (easy) way to set the weights of a layer to a non-Parameter tensor (e.g. one produced by another layer). My question is: why?

I would imagine it would be fine to replace the parameters with any given tensor (at which point the tensor would no longer appear in self.parameters(), and I assume the side effect of the old optimizer taking steps on parameters that are no longer used is also fine).
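For concreteness, a minimal sketch of what happens when you try this: nn.Module rejects assigning a plain tensor to a registered parameter attribute, but after deleting the Parameter the tensor can be set as an ordinary attribute, at which point it indeed no longer shows up in parameters(). (This reflects nn.Module.__setattr__ behavior; the shapes here are arbitrary.)

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 3)

# Assigning a plain tensor to a registered Parameter attribute is rejected.
rejected = False
try:
    layer.weight = torch.randn(3, 4)  # not an nn.Parameter
except TypeError:
    rejected = True
print("assignment rejected:", rejected)

# After deleting the registered Parameter, a plain tensor can be set as an
# ordinary attribute; it then no longer appears in layer.parameters().
del layer.weight
layer.weight = torch.randn(3, 4)
in_params = any(p is layer.weight for p in layer.parameters())
print("tensor appears in parameters():", in_params)
```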

Is this right, or am I missing some important detail that makes this impossible?

For use cases like this, use the functional API under torch.nn.functional.*.
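A minimal sketch of this approach: instead of overwriting a layer's weights, generate the weight tensor with another module and pass it directly to torch.nn.functional.linear. Gradients then flow back into the generating module. The DynamicLinear class and its dimensions are illustrative, not part of the torch API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicLinear(nn.Module):
    """A linear layer whose weights are produced by a small hypernetwork."""

    def __init__(self, in_features, out_features, hyper_dim):
        super().__init__()
        self.in_features = in_features
        self.out_features = out_features
        # The hypernetwork's parameters are the only trainable state here.
        self.weight_gen = nn.Linear(hyper_dim, out_features * in_features)
        self.bias_gen = nn.Linear(hyper_dim, out_features)

    def forward(self, x, context):
        # Generate the weight and bias from the context vector, then apply
        # them with the functional API; no Parameter is ever replaced.
        w = self.weight_gen(context).view(self.out_features, self.in_features)
        b = self.bias_gen(context)
        return F.linear(x, w, b)

layer = DynamicLinear(in_features=4, out_features=3, hyper_dim=2)
x = torch.randn(5, 4)
context = torch.randn(2)
y = layer(x, context)
print(y.shape)  # torch.Size([5, 3])
```

Because the generated weights are ordinary tensors in the autograd graph, an optimizer over layer.parameters() trains the hypernetwork, which is exactly the "dynamic weight network" setup the question describes.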