Equivalent of Keras NonNeg weight constraint in PyTorch?

Keras has a constraint that forces the weights of a model to be non-negative:

tf.keras.constraints.NonNeg()
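
For context, this is how I would attach the constraint to a layer in Keras (just an illustrative one-unit Dense layer):

# Keras usage for reference: the constraint is passed as the layer's
# kernel_constraint, so the kernel weights are kept non-negative.
import tensorflow as tf

layer = tf.keras.layers.Dense(1, kernel_constraint=tf.keras.constraints.NonNeg())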

What is the equivalent of this in PyTorch? Let's say my model is the one below; how should I change it to force the weights to be non-negative?

import torch

class LogisticRegression(torch.nn.Module):
    def __init__(self, input_dim, output_dim):
        super(LogisticRegression, self).__init__()
        # a single linear layer whose weights I want to keep non-negative
        self.linear = torch.nn.Linear(input_dim, output_dim)

    def forward(self, x):
        outputs = self.linear(x)
        return outputs
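
Would something like the sketch below, which clamps the weights in place after every optimizer step, be the right idea? The dimensions, dummy data, and training loop are only for illustration, and I am not sure this is actually equivalent to what NonNeg does in Keras.

# Sketch only: enforce non-negativity by clamping the linear layer's
# weights to >= 0 after each optimizer step. Dimensions and data are
# made up for the example.
import torch

model = LogisticRegression(input_dim=10, output_dim=1)  # example sizes
criterion = torch.nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 10)                    # dummy batch of 32 samples
y = torch.randint(0, 2, (32, 1)).float()   # dummy binary targets

for epoch in range(5):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    # after the update, clamp the weights in place so they stay >= 0
    with torch.no_grad():
        model.linear.weight.clamp_(min=0.0)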