Parameter sharing along a dimension

Hi,
Suppose I have a weight tensor of shape (4, 1), and I want to expand it along the second dimension so that it becomes (4, 4) while the underlying storage does not change, then use it flattened as (16,).
In code:

nn.parameter.Parameter(torch.Tensor(4,1).expand(4,4).view(16))

Note that the above code throws an exception in the view() call, but is there (or will there be) support for such usage?
Thanks!

You need to keep the Parameter of shape 4 x 1 and only do the expand in forward().
You also need reshape instead of view, as it needs to instantiate the larger matrix (the expanded tensor is not contiguous, so view cannot reinterpret it).
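A minimal sketch of that pattern (the module and its names are hypothetical, just for illustration):

```python
import torch
import torch.nn as nn

class SharedWeight(nn.Module):
    # Keep a single (4, 1) Parameter; expand it only inside forward().
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(4, 1))

    def forward(self, x):
        # expand() creates a view sharing storage with self.weight;
        # reshape() then materializes a contiguous (16, 1) tensor, but
        # gradients still flow back to the single (4, 1) parameter.
        w = self.weight.expand(4, 4).reshape(16, 1)
        return x @ w  # x: (N, 16) -> output: (N, 1)

m = SharedWeight()
x = torch.randn(2, 16)
out = m(x)
out.sum().backward()
print(m.weight.grad.shape)  # torch.Size([4, 1])
```

The parameter itself stays (4, 1), so the four values are shared across the expanded dimension and their gradients accumulate on the original storage.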

Best regards

Thomas

Right, thank you very much!