Constrain Weight Matrix to Sum to 1

For example, given a simple linear layer y = Ax, I would like the weights A to sum to at most 1, with each individual weight > 0. Any idea what the ideal way to do this is?

y = 0.1 x_1 + 0.5 x_2 + 0.4 x_3

import torch.nn as nn


class simple_fnn(nn.Module):
    def __init__(self, num_hidden_units=1):
        super(simple_fnn, self).__init__()
        self.num_hidden_units = num_hidden_units

        # Single linear layer: 6 inputs -> num_hidden_units outputs, no bias
        self.fc_single = nn.Linear(6, self.num_hidden_units, bias=False)

    def forward(self, x):
        x = self.fc_single(x)
        return x

I think you can combine softmax with your parameters.

Maybe you should define your own module and do the following:

def forward(self, x):
    # Normalize the raw weights so each row is > 0 and sums to 1
    w_normalized = nn.functional.softmax(self.weight, dim=1)
    output = nn.functional.linear(x, w_normalized)

    return output

Then, you can train the model that satisfies your constraint.
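For reference, a minimal sketch of such a module (the class name and shapes are illustrative). Registering the raw weights as an nn.Parameter matters, since a plain tensor attribute is invisible to model.parameters() and the optimizer will complain about an empty parameter list:

import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftmaxLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super(SoftmaxLinear, self).__init__()
        # Raw, unconstrained weights, registered as a Parameter
        # so the optimizer can find and update them
        self.weight = nn.Parameter(torch.randn(out_features, in_features))

    def forward(self, x):
        # Softmax over the input dimension: each output's incoming
        # weights are > 0 and sum to exactly 1
        w_normalized = F.softmax(self.weight, dim=1)
        return F.linear(x, w_normalized)

The constraint then holds by construction at every step of training, because the optimizer only ever touches the unconstrained raw weights.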


Can’t seem to make this work. Any ideas, @apaszke @smth? I got the following error:

ValueError: optimizer got an empty parameter list

I tried the following, but the clipper doesn’t seem to work; I still get negative weights despite the softmax clip. It seems like the new values aren’t being assigned back to the original weight matrix.


class MaxOneClipper(object):

    def __init__(self, frequency=1):
        self.frequency = frequency

    def __call__(self, module):
        # filter the variables to get the ones you want
        if hasattr(module, 'weight'):
            w = module.weight
            sm = nn.Softmax(dim=1)
            # NOTE: this only rebinds the local name `w`;
            # module.weight itself is never updated
            w = sm(w)

clipper = MaxOneClipper()

for i in range(10):
    net.apply(clipper)
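A version that does write the normalized values back would look something like the sketch below (one possible fix, assuming a 2-D weight as in nn.Linear; note that the next optimizer step can move the weights off the constraint again, which is why normalizing inside forward, as suggested above, is more robust):

import torch
import torch.nn.functional as F


class MaxOneClipper(object):

    def __call__(self, module):
        if hasattr(module, 'weight'):
            with torch.no_grad():
                # Overwrite the weight tensor in place with its softmax,
                # so each row is > 0 and sums to 1
                module.weight.copy_(F.softmax(module.weight, dim=1))


clipper = MaxOneClipper()
net.apply(clipper)  # re-apply after each optimizer step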

Hi @ritchieng,
Have you solved the problem?

I’m interested in the answer.

Thanks