How to tell the C++ optimizer step not to allow a tensor value to go below zero

Hi all - I have an autograd tensor and I do not want negative values to be allowed - how can I get SGD to not send it below zero?


If you want to restrict a parameter to hold only positive values, you can use a parametrization. It might look something like:

import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

# Parametrization that maps the underlying parameter to its clamped version
class Positive(nn.Module):
    def forward(self, X):
        return torch.clamp_min(X, 0)

layer = nn.Linear(3, 3)
# layer.weight is now recomputed as clamp_min(original, 0) on every access
parametrize.register_parametrization(layer, "weight", Positive())
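To convince yourself that SGD can't push the effective weight below zero, you can run a few update steps and inspect `layer.weight` afterwards. Here's a quick sketch (repeating the setup above so it runs on its own; the optimizer updates the unconstrained underlying tensor, while `layer.weight` is always the clamped view):

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class Positive(nn.Module):
    def forward(self, X):
        return torch.clamp_min(X, 0)

layer = nn.Linear(3, 3)
parametrize.register_parametrization(layer, "weight", Positive())

opt = torch.optim.SGD(layer.parameters(), lr=0.5)
for _ in range(10):
    opt.zero_grad()
    # A loss that pushes the weights downward; the parametrization still
    # clamps the effective weight at zero on every access.
    loss = layer(torch.randn(4, 3)).sum()
    loss.backward()
    opt.step()

print((layer.weight >= 0).all())  # → tensor(True)
```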

This is only available in PyTorch versions >= 1.9.0, though.
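On older versions, a common workaround is to clamp the parameter in place after each optimizer step (a projected-gradient-style approach). A minimal sketch:

```python
import torch
import torch.nn as nn

layer = nn.Linear(3, 3)
opt = torch.optim.SGD(layer.parameters(), lr=0.5)

for _ in range(10):
    opt.zero_grad()
    loss = layer(torch.randn(4, 3)).sum()
    loss.backward()
    opt.step()
    # Project the weight back onto the feasible region after each update.
    # no_grad() is needed because clamp_ modifies a leaf tensor in place.
    with torch.no_grad():
        layer.weight.clamp_(min=0)

print((layer.weight >= 0).all())  # → tensor(True)
```

Note that unlike the parametrization, this only enforces the constraint at the points where you clamp, and the gradients are computed with respect to the already-clamped values.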
