Pass parameter to next layer

Hi

How can I pass the weights from a previous layer to the next layer?
That is, I want to create the parameters dynamically at runtime.

I tried this strategy, but it doesn't work. I get the error ValueError: optimizer got an empty parameter list

import torch.nn as nn

class LazyLayer(nn.Module):
    def __init__(self):
        super().__init__()
        # Placeholder; the actual parameter is created on the first forward pass
        self.register_parameter('weight', None)

    def reset_parameters(self, input):
        self.weight = nn.Parameter(input)

    def forward(self, input):
        if self.weight is None:
            self.reset_parameters(input)
        return self.weight

You would have to create the parameter before constructing the optimizer for this to work. In your code, self.weight is still None when you call model.parameters(), so the optimizer receives an empty list. Once an optimizer has been created, it will not pick up parameters added later; you would have to call optimizer.add_param_group or create a new optimizer.
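One workaround, as a minimal sketch: run a single dummy forward pass so the weight is materialized, and only then build the optimizer. (This uses the LazyLayer module from the question above; the 5x5 input shape and SGD hyperparameters are just placeholders.)

import torch
import torch.optim as optim

model = LazyLayer()

# One dummy forward pass creates self.weight
model(torch.randn(5, 5))

# parameters() is no longer empty, so the optimizer can be built
optimizer = optim.SGD(model.parameters(), lr=0.01)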

You can manually set the weight parameters of one layer equal to those of another layer (as long as the shapes match!):

import torch.nn as nn

l1 = nn.Linear(5, 5)
l2 = nn.Linear(5, 5)

# l2 now shares the exact same Parameter objects as l1
l2.weight = l1.weight
l2.bias = l1.bias

Since both layers now reference the same Parameter objects, the gradients from both layers will be accumulated into the shared weight and bias!
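As a quick, illustrative check of the sharing (the shapes and input here are arbitrary):

import torch
import torch.nn as nn

l1 = nn.Linear(5, 5)
l2 = nn.Linear(5, 5)
l2.weight = l1.weight
l2.bias = l1.bias

x = torch.randn(3, 5)
out = l2(l1(x))  # gradient flows through both layers into the shared weight
out.sum().backward()

print(l1.weight is l2.weight)            # True: one shared Parameter
print(l1.weight.grad is l2.weight.grad)  # True: one accumulated gradient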