Parameter sharing along a dimension

Suppose I have a weight tensor of shape (4, 1), and I want to expand it along the second dimension to (4, 4) without changing the underlying storage, and then use it as a (16, 1) tensor.
In code:
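The original snippet is missing, but from the description it likely looked something like this sketch (variable names are assumptions):

```python
import torch

w = torch.nn.Parameter(torch.randn(4, 1))
expanded = w.expand(4, 4)  # no copy: stride 0 along dim 1, shares storage with w

try:
    flat = expanded.view(16, 1)  # fails: the expanded tensor is not contiguous
except RuntimeError as e:
    print("view() failed:", e)
```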


Note that the above code throws an exception in the view() call, but is there (or will there be) support for such usage?

You need to keep the Parameter of shape 4 x 1 and only do the expand in .forward.
You also need reshape instead of view: the expanded tensor is not contiguous, so the larger matrix has to be instantiated, which reshape will do (and view cannot).
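A minimal sketch of that approach (the module name and shapes are illustrative assumptions):

```python
import torch
import torch.nn as nn

class SharedLinear(nn.Module):
    """Keeps a (4, 1) parameter; materializes the (16, 1) weight only in forward."""
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(4, 1))

    def forward(self, x):
        # expand() is free (no copy); reshape() then copies into a contiguous
        # (16, 1) tensor. Gradients still accumulate into the (4, 1) parameter.
        w = self.weight.expand(4, 4).reshape(16, 1)
        return x @ w  # x: (N, 16) -> (N, 1)

m = SharedLinear()
out = m(torch.randn(2, 16))
```

Each of the 4 parameter values appears 4 times in the materialized weight, so the gradient of each value is the sum over its 4 copies.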

Best regards


Right, thank you very much!