How to scale the output range of a network

I am trying to limit the range of values that a tensor can take in the forward pass like this:

```python
import torch
import torch.nn as nn

class ScaleRange(nn.Module):
    def __init__(self, max_):
        super(ScaleRange, self).__init__()
        self.max = max_
        self.inv_scale = 255.0 / max_
        self.scale = max_ / 255.0
        self.clamp = nn.Hardtanh(min_val=0.0, max_val=self.max)

    def forward(self, x):
        out = self.clamp(x)
        out = torch.mul(out, self.inv_scale)
        out = torch.floor(out)
        out = torch.mul(out, self.scale)
        return out
```

Here I take the input tensor, clamp it to (0, max), then scale it by 255/max, floor the result, and then scale it back again by max/255, thus restricting the range of values to (0, 1.0). Is this the correct way to do this?

I don't think your code is doing what you're trying to achieve.
Let's walk through what each line of code is doing.

Assume you have a random tensor and set `max_` to some value, e.g. 2.
The `clamp` operation will then cut off all values below 0.0 and above 2.0:

```python
import torch
import torch.nn as nn

x = torch.randn(10) * 10
max_val = 2.
clamp = nn.Hardtanh(min_val=0.0, max_val=max_val)
x = clamp(x)
print(x)
> tensor([2.0000, 0.0000, 2.0000, 2.0000, 0.0000, 0.0000, 0.0000, 0.4396, 2.0000,
          2.0000])
```

`x` now holds values in `[0, max_]`.
Scaling it by `255./max_` will put `x` in the range `[0, 255.]`:

```python
inv_scale = 255. / max_val
out = x * inv_scale
print(out)
> tensor([255.0000,   0.0000, 255.0000, 255.0000,   0.0000,   0.0000,   0.0000,
           56.0499, 255.0000, 255.0000])
```

Now the `floor` basically only affects values between `0.` and `255.`, as the min and max values are already "floored":

```python
out = torch.floor(out)
print(out)
> tensor([255.,   0., 255., 255.,   0.,   0.,   0.,  56., 255., 255.])
```

Multiplying by `scale = max_/255.` just maps the range back to `[0, max_]`:

```python
scale = max_val / 255.
out = out * scale
print(out)
> tensor([2.0000, 0.0000, 2.0000, 2.0000, 0.0000, 0.0000, 0.0000, 0.4392, 2.0000,
          2.0000])
```

We are basically back at the clamped tensor, with a small difference due to the `floor` op.
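To make that "small difference" concrete: the whole clamp → scale → floor → rescale pipeline is equivalent to snapping each value onto a grid of 256 levels spaced `max_/255` apart, i.e. it quantizes the tensor rather than rescaling it. A minimal sketch (the variable names here are illustrative, not from the original post):

```python
import torch

max_val = 2.0
x = torch.tensor([0.4396, 1.2345, 1.9999])

step = max_val / 255.0  # grid spacing of the 256 quantization levels

# floor(x * 255/max) * max/255 == floor(x / step) * step
out = torch.floor(x.clamp(0.0, max_val) / step) * step

# Every output is (up to float error) an integer multiple of step,
# and never larger than the clamped input.
print(out / step)
```

So the range stays `[0, max_]`; only the resolution inside that range changes.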

If you want to scale your tensor to the range `[0, 1]`, you could subtract the minimum value and divide by the maximum:

```python
import torch

x = torch.randn(10)
x -= x.min()
x /= x.max()
```
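If you want that normalization as a drop-in layer like your `ScaleRange`, a minimal sketch could look like this (the `MinMaxScale` name and the `eps` guard are my own additions, not from the post above):

```python
import torch
import torch.nn as nn

class MinMaxScale(nn.Module):
    """Illustrative module: rescale each input tensor to [0, 1]."""
    def __init__(self, eps=1e-8):
        super().__init__()
        self.eps = eps  # avoids division by zero for constant inputs

    def forward(self, x):
        x = x - x.min()
        return x / (x.max() + self.eps)

scaler = MinMaxScale()
out = scaler(torch.randn(10) * 10)
print(out.min().item(), out.max().item())  # ~0.0 and ~1.0
```

Note that this normalizes over the whole tensor; per-sample or per-channel scaling would need the min/max taken along specific dimensions.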