How do I implement non-negative logistic regression in PyTorch for single-output classification, with all weights constrained to be non-negative?

I have read some tutorials on logistic regression in PyTorch, but they don't explain how to implement the non-negative variant of the model.

Basically, I have a linear classifier that needs to output a single probability (I assume via a sigmoid), and I need all of the weights of the network to be non-negative. Roughly, the model looks like the sketch below.
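Here is a minimal sketch of what I have right now (`input_dim` is just a placeholder for my actual feature dimension):

```python
import torch
import torch.nn as nn

class LogisticRegression(nn.Module):
    def __init__(self, input_dim):
        super().__init__()
        # single linear layer producing one logit per example
        self.linear = nn.Linear(input_dim, 1)

    def forward(self, x):
        # squash the logit into (0, 1) to get a probability
        return torch.sigmoid(self.linear(x))
```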

It was suggested that I use a non-negative variant of logistic regression to get non-negative weights, but I'm not sure how to implement that constraint. My model is currently just the simple one-layer linear classifier above.
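The only idea I have so far (just a rough sketch, I'm not sure it is the right approach) is to project the weights back into the non-negative region after every optimizer step, something like this (the `loader`, the learning rate, and `input_dim=10` are placeholders):

```python
import torch

model = LogisticRegression(input_dim=10)   # placeholder feature dimension
criterion = torch.nn.BCELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for x, y in loader:  # some DataLoader yielding (features, 0/1 labels)
    optimizer.zero_grad()
    probs = model(x).squeeze(1)
    loss = criterion(probs, y.float())
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        # clamp the weights back to >= 0 after the gradient update
        model.linear.weight.clamp_(min=0.0)
```

Is clamping after each step the correct way to do this, or is there a more standard way to enforce the non-negativity?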

Any help is appreciated.