Hello,
while taking a look at the histogram observer, I noticed that the min/max values of the incoming data are cast to int when creating or interpolating the histogram (see this line). I fail to see why that is necessary, or even a good idea.
I tried out just the histogram interpolation algorithm implemented in the observer, and for low-standard-deviation data it gives a pretty bad approximation of the combined histogram; if the cast to int is removed, however, the approximation is far better.
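To illustrate what I mean, here is a minimal sketch (not the observer's actual code, just a toy comparison I put together) showing how casting the histogram bounds to int hurts resolution for narrow data. The data values, bin count, and use of `torch.histc` are my own choices for the example:

```python
# Toy comparison: histogram bounds taken as floats vs. cast to int,
# for data with a small standard deviation sitting between two integers.
import torch

torch.manual_seed(0)
x = torch.randn(10_000) * 0.05 + 2.3   # low-std data centered away from an integer

x_min, x_max = x.min().item(), x.max().item()

# Float bounds: the 256 bins cover only the actual data range (~0.4 wide here).
hist_float = torch.histc(x, bins=256, min=x_min, max=x_max)

# Int-cast bounds: the same 256 bins are spread over a whole unit interval
# (here [2, 3]), so only a handful of bins are ever populated.
hist_int = torch.histc(x, bins=256, min=int(x_min), max=int(x_max) + 1)

print("non-empty bins (float bounds):", int((hist_float > 0).sum()))
print("non-empty bins (int bounds):  ", int((hist_int > 0).sum()))
```

With the int-cast bounds almost all of the histogram's resolution is wasted on empty range, which matches the poor interpolation quality I saw.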
Can anyone explain to me why the casting occurs?