Exponential Smoothing for Time Series Forecasting in PyTorch

Does anyone know of a PyTorch implementation of Exponential Smoothing (ES)? Since ES is a local model, each time series in a dataset would require its own set of ES parameters (alpha, beta, etc.). Just wondering how this would be implemented. Any help would be appreciated.
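For concreteness, the naive per-series recursion I have in mind looks like this (just a sketch; `simple_es` and the batch layout are my own naming):

```python
import torch

def simple_es(y, alpha):
    """Simple exponential smoothing: l_t = alpha * y_t + (1 - alpha) * l_{t-1}.

    y:     (n_series, T) batch of series
    alpha: (n_series,)   one smoothing parameter per series
    """
    level = y[:, 0]                    # initialize the level with the first observation
    levels = [level]
    for t in range(1, y.shape[1]):     # Python loop over time steps
        level = alpha * y[:, t] + (1 - alpha) * level
        levels.append(level)
    return torch.stack(levels, dim=1)  # (n_series, T) smoothed levels

y = torch.tensor([[1., 2., 3., 4.],
                  [4., 3., 2., 1.]])
alpha = torch.tensor([0.5, 0.1], requires_grad=True)  # per-series alphas
levels = simple_es(y, alpha)
```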

It is a pain to do, as implementations with a Python loop + gradients are slow. You may need something similar to torchaudio.functional.lfilter (which resorts to a C++ loop and doesn't support backprop, I think).
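If you stick with plain simple ES (l_t = alpha·y_t + (1 − alpha)·l_{t−1}), one way to avoid the Python loop entirely is to unroll the recursion into cumulative sums. A sketch (my own naming; note it is differentiable but numerically unstable for long series, since (1 − alpha)^(−k) explodes):

```python
import torch

def simple_es_vectorized(y, alpha):
    """Loop-free simple ES via cumulative sums (sketch).

    Unrolling l_t = alpha*y_t + (1-alpha)*l_{t-1} with l_0 = y_0 gives
        l_t = (1-alpha)^t * y_0 + alpha * sum_{k=1..t} (1-alpha)^(t-k) * y_k,
    which a single cumsum can evaluate. Differentiable w.r.t. alpha, but
    numerically unstable for long series: (1-alpha)^(-k) grows without bound.
    y: (n_series, T), alpha: (n_series,)
    """
    T = y.shape[1]
    beta = (1 - alpha).unsqueeze(1)               # (n, 1)
    t = torch.arange(T, dtype=y.dtype)            # 0, 1, ..., T-1
    decay = beta ** t                             # (n, T): (1-alpha)^t
    scaled = torch.zeros_like(y)
    scaled[:, 1:] = y[:, 1:] * beta ** (-t[1:])   # y_k * (1-alpha)^(-k)
    return decay * (y[:, :1] + alpha.unsqueeze(1) * scaled.cumsum(dim=1))

y = torch.tensor([[1., 2., 3., 4.]])
alpha = torch.tensor([0.5], requires_grad=True)
levels = simple_es_vectorized(y, alpha)
```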

Well, the obvious options are:

  1. use multiple fixed (never trained) parameter sets, and concatenate the outputs as input for a downstream task (e.g. pass them to an RNN layer)
  2. train "amortized" params conditioned on some time series metadata
  3. train "unamortized" params (nn.Embedding lookups by time series id), with no straightforward way to do out-of-sample predictions
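Option 3 might be sketched like this (`PerSeriesES` is a hypothetical name; a sigmoid keeps each learned alpha in (0, 1), and the Python loop over time is still the slow part mentioned above):

```python
import torch
import torch.nn as nn

class PerSeriesES(nn.Module):
    """One trainable smoothing parameter per series id (hypothetical sketch)."""
    def __init__(self, n_series):
        super().__init__()
        # raw logits; sigmoid keeps each alpha in (0, 1)
        self.alpha_logits = nn.Embedding(n_series, 1)
        nn.init.zeros_(self.alpha_logits.weight)   # start every series at alpha = 0.5

    def forward(self, series_ids, y):
        # series_ids: (n,) integer ids, y: (n, T) observations
        alpha = torch.sigmoid(self.alpha_logits(series_ids)).squeeze(-1)  # (n,)
        level = y[:, 0]
        levels = [level]
        for t in range(1, y.shape[1]):
            level = alpha * y[:, t] + (1 - alpha) * level
            levels.append(level)
        return torch.stack(levels, dim=1)

model = PerSeriesES(n_series=100)
ids = torch.tensor([0, 7])
y = torch.tensor([[1., 2., 3., 4.],
                  [4., 3., 2., 1.]])
out = model(ids, y)
out.sum().backward()  # gradients flow only into the looked-up alpha embeddings
```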