How to simultaneously quantize and normalize the output of a net?

How can you quantize, as well as normalize the output of a network?

For example, say I have a net whose output I pass through a sigmoid. This squashes all the values into [0, 1]. What I would like is normalized histograms, where the values of the bins are quantized to, say, 256 levels.

Here’s some code to simulate the net’s output:

import torch

batch_size = 2
num_classes = 10
levels = 256

out = torch.randn(batch_size, num_classes)
out = torch.sigmoid(out)     # squash to between 0 and 1
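
As an aside: if all I needed were rows that sum to one, a softmax over the raw outputs would give that directly. I'm only noting it as an alternative to the sigmoid-then-normalize route, it isn't what my net actually does:

probs = torch.softmax(torch.randn(batch_size, num_classes), dim=1)  # rows are non-negative and sum to 1
torch.sum(probs, 1)  # all ones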

Now the normalization step:

row_sums = torch.sum(out, 1, keepdim=True)   # per-row sums, shape (batch_size, 1)
row_sums = row_sums.repeat(1, num_classes)   # expand to the same size as out
out = torch.div(out, row_sums)               # each row is now a histogram

torch.sum(out, 1, keepdim=True)  # yay :) they sum to one

 1.0000
 1.0000
[torch.FloatTensor of size 2x1]
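
Side note: the explicit repeat shouldn't be needed if broadcasting is available; dividing by a keepdim=True sum does the same normalization. A shorter sketch of that same step:

out = out / torch.sum(out, 1, keepdim=True)  # the (batch_size, 1) sums broadcast across each row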

Now try to quantize each bin of the histogram to the values [0, 1/256, ..., 1]:

out = torch.mul(out, levels)
out = torch.round(out)   # use round, not floor, as floor will lose probability mass
out = torch.div(out, levels)

torch.mul(out, levels)   # yay :) they're quantized

   24     7    24    19    41    25    38    33     9    34
   12    17    31    39    16    35    25    29    14    36
[torch.FloatTensor of size 2x10]

torch.sum(out, 1, keepdim=True)  # oh dear, no longer normalized

 0.9961
 0.9961
[torch.FloatTensor of size 2x1]
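
One way I can think of to satisfy both constraints at once is to quantize each row to integer counts that are forced to sum to exactly levels (a largest-remainder style correction), and only then divide by levels. Below is a rough sketch of that idea; quantize_rows is just a name I made up, not anything from the PyTorch API, and it is meant to be applied to the normalized histograms before any rounding:

def quantize_rows(hist, levels=256):
    # round each row of hist to multiples of 1/levels while keeping the row sum exactly 1
    scaled = hist * levels
    counts = torch.floor(scaled)              # integer part of each bin
    remainders = scaled - counts              # fractional mass still to be placed
    deficit = levels - counts.sum(dim=1)      # units each row is short of summing to levels
    order = torch.argsort(remainders, dim=1, descending=True)
    for row in range(hist.size(0)):
        k = int(round(float(deficit[row])))   # number of bins that get one extra unit
        counts[row, order[row, :k]] += 1      # give the extra units to the largest remainders
    return counts / levels

hist = torch.sigmoid(torch.randn(batch_size, num_classes))
hist = hist / hist.sum(1, keepdim=True)   # normalized histograms, rows sum to 1
q = quantize_rows(hist, levels)
torch.sum(q, 1)        # should be exactly 1.0 for every row
torch.mul(q, levels)   # integer bin counts that sum to levels

The rows stay on the 1/levels grid and still sum to one, because the integer counts are forced to add up to levels before the final division.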

You already seem to have a solution :stuck_out_tongue: Is this a question or an answer?