Histogram Observer Implementation Error

It seems to me that the HistogramObserver implementation has a bug. At line 1181, the forward pass makes the following call:

combined_histogram = self._combine_histograms(
    combined_histogram,
    self.histogram,
    self.upsample_rate,
    downsample_rate,
    start_idx,
    self.bins,
)
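For context, my understanding of the surrounding forward() logic (a paraphrased sketch from my reading of the source, not the exact upstream code) is that combined_histogram at this point holds only the current batch's counts, binned over the new, widened range:

import torch

# Illustrative stand-ins; in the real code these come from the observer's
# state and the incoming tensor.
x = torch.randn(1000)                   # current batch of activations
combined_min, combined_max = -4.0, 4.0  # range widened to cover old and new data
bins = 2048

# Histogram of the *current batch only*, over the combined range.
combined_histogram = torch.histc(x, bins, min=combined_min, max=combined_max)

# self.histogram, by contrast, holds the *previously accumulated* counts over
# the old (possibly narrower) range, so it needs re-binning before it can be
# merged, which is what _combine_histograms is for.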

The _combine_histograms method itself has the following signature:

def _combine_histograms(
    self,
    orig_hist: torch.Tensor,
    new_hist: torch.Tensor,
    upsample_rate: int,
    downsample_rate: int,
    start_idx: int,
    Nbins: int,
) -> torch.Tensor:

So combined_histogram is passed as orig_hist, while self.histogram is passed as new_hist. Given the variable names, it seems like it should be the other way around: self.histogram holds the previously accumulated (i.e. original) data, and combined_histogram holds the histogram of the new batch.
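To make the mapping concrete, here is a sketch of what I understand the re-bin-and-accumulate step to do, paraphrased from my reading of the implementation (details may differ from the actual upstream code):

import torch

def combine_histograms_sketch(
    orig_hist: torch.Tensor,   # current batch, already binned over the combined range
    new_hist: torch.Tensor,    # previously accumulated histogram (self.histogram)
    upsample_rate: int,
    downsample_rate: int,
    start_idx: int,
    Nbins: int,
) -> torch.Tensor:
    # Precondition (arranged by the caller in the real code):
    # start_idx + Nbins * upsample_rate <= Nbins * downsample_rate.
    # Re-bin the accumulated histogram into the wider combined range:
    # upsample each bin by a factor of upsample_rate ...
    upsampled = new_hist.repeat_interleave(upsample_rate)
    # ... place it at the correct offset inside the wider range ...
    wide = torch.zeros(Nbins * downsample_rate)
    wide[start_idx : start_idx + Nbins * upsample_rate] = upsampled
    # ... then downsample back to Nbins via an integral (cumulative) histogram.
    integral = torch.cumsum(wide, 0, dtype=torch.double)[downsample_rate - 1 :: downsample_rate]
    shifted = torch.zeros(Nbins, dtype=torch.double)
    shifted[1:] = integral[:-1]
    rebinned = ((integral - shifted) / upsample_rate).to(torch.float)
    # The current batch's histogram is added unchanged, since it was computed
    # over the combined range to begin with.
    return orig_hist + rebinned

Read this way, new_hist is the histogram that gets re-binned (the stored self.histogram, i.e. the older data), while orig_hist is added as-is, which is why the parameter names read backwards to me.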

Can someone clarify?

Thank you!!

Will address your question in your GitHub issue: Bug in Histogram Observer Implementation · Issue #87126 · pytorch/pytorch · GitHub