I am trying to limit the range of values that a tensor can take in the forward pass like this:
import torch
import torch.nn as nn

class ScaleRange(nn.Module):
    def __init__(self, max_):
        super(ScaleRange, self).__init__()
        self.max = max_
        self.inv_scale = 255.0 / max_   # maps (0, max) onto (0, 255)
        self.scale = max_ / 255.0       # maps (0, 255) back onto (0, max)
        self.clamp = nn.Hardtanh(min_val=0.0, max_val=self.max)

    def forward(self, x):
        out = self.clamp(x)             # clamp input to (0, max)
        with torch.no_grad():
            out = torch.mul(out, self.inv_scale)
            out = torch.floor(out)      # snap to integer levels 0..255
            out = torch.mul(out, self.scale)
        return out
Here I take the input tensor, clamp it to (0, max), scale it by 255/max, floor the result, and then scale it back by max/255, which restricts the tensor to 256 discrete levels in the range (0, max). Is this the correct way to do this?
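For reference, here is a minimal standalone run of the module (the input values and max_ = 2.0 are arbitrary example choices, not part of my actual model):

```python
import torch
import torch.nn as nn

class ScaleRange(nn.Module):
    def __init__(self, max_):
        super(ScaleRange, self).__init__()
        self.max = max_
        self.inv_scale = 255.0 / max_   # maps (0, max) onto (0, 255)
        self.scale = max_ / 255.0       # maps (0, 255) back onto (0, max)
        self.clamp = nn.Hardtanh(min_val=0.0, max_val=self.max)

    def forward(self, x):
        out = self.clamp(x)             # clamp input to (0, max)
        with torch.no_grad():
            out = torch.mul(out, self.inv_scale)
            out = torch.floor(out)      # snap to integer levels 0..255
            out = torch.mul(out, self.scale)
        return out

sr = ScaleRange(max_=2.0)
x = torch.tensor([-1.0, 0.5, 1.7, 3.0])
out = sr(x)
print(out)  # values clamped into [0, 2] and rounded down to multiples of 2/255
```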