Learning correlation parameters of two signals

Hey folks,

Suppose that I have two signals, where one is basically a delayed and scaled version of the other, plus some noise for fun. I am trying to learn that delay and scale.

So, my input features are the last N samples of the first signal, and I want to build a model that predicts the second signal from that recent window of the first.

The scale part is simple enough to learn: I can create a learnable parameter for it and just multiply my input features by that value in my module’s forward function. I’m not sure how to code up the delay in a way that makes it learnable, though. It essentially amounts to choosing which sample of the input window to output (or, since the delay is a float, interpolating between two adjacent samples).
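
To make the idea concrete, here is a rough sketch of what I have in mind (the class name, `n_samples`, and the initial values are just placeholders, and I’m not sure the interpolation trick is the right approach): the scale is a plain `nn.Parameter`, and the fractional delay is realised by linearly interpolating between the two adjacent samples of the input window, so the gradient flows through the fractional part while the integer part is treated as fixed for a given value.

```python
import torch
import torch.nn as nn


class ScaledDelay(nn.Module):
    """Sketch: predict y[t] ~= scale * x[t - delay] from a window of x,
    with both scale and a fractional delay as learnable parameters."""

    def __init__(self, n_samples):
        super().__init__()
        self.n_samples = n_samples
        # Learnable scale, as described above.
        self.scale = nn.Parameter(torch.tensor(1.0))
        # Learnable delay in samples; may end up fractional.
        self.delay = nn.Parameter(torch.tensor(0.0))

    def forward(self, x):
        # x: (batch, N) window of the last N samples, with x[:, -1] the newest.
        # Keep the delay inside the window so the indices stay valid.
        delay = self.delay.clamp(0.0, float(self.n_samples - 1))
        lower = torch.floor(delay)   # integer part: zero gradient
        frac = delay - lower         # fractional part: carries the gradient
        i0 = (self.n_samples - 1) - int(lower.item())  # sample at floor(delay)
        i1 = max(i0 - 1, 0)                            # neighbouring (older) sample
        # Linear interpolation between the two adjacent samples.
        delayed = (1.0 - frac) * x[:, i0] + frac * x[:, i1]
        return self.scale * delayed
```

One thing I’m unsure about: with this interpolation the loss is only piecewise linear in the delay, so the gradient is purely local and the initial guess for the delay probably matters.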

So, what is the best way to achieve this? I can do it with other packages (say, a nonlinear optimizer, or scikit’s signal processing functions), but I am wondering how to do it in PyTorch.
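
For reference, this is roughly how I would expect to train the sketch above, assuming `windows` is a `(batch, N)` tensor of input windows and `target` holds the corresponding samples of the second signal (both names are made up here):

```python
import torch
import torch.nn.functional as F

# Assumed data: `windows` is (batch, N) slices of the first signal and
# `target` is (batch,) with the matching samples of the second signal.
model = ScaledDelay(n_samples=windows.shape[1])
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(2000):
    optimizer.zero_grad()
    loss = F.mse_loss(model(windows), target)
    loss.backward()
    optimizer.step()

print(model.scale.item(), model.delay.item())
```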