Which model should I use for my data? Evaluation and graphical description of time-series data

Two infrared sensors have been deployed in an experiment. They function as simple light barriers, outputting low voltage values while something blocks the path between emitter and sensor and high values while the path is clear. Objects regularly pass through both sensors, one right after the other. The parameters of interest are the current object speed (from the sensor spacing and the time delay between the two signals), the object length (from the speed and the duration of the low-voltage phase), and the distance between objects (from the speed and the duration of the high-voltage phase).
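For reference, here is a minimal sketch of the "traditional" calculation I mean. All names and the setup are my own assumptions: I assume the voltage has already been thresholded into binary traces sampled at a fixed rate, and that `sensor_gap_m` is the spacing between the two sensors.

```python
def edges(trace):
    """Indices where a binary trace changes value."""
    return [i for i in range(1, len(trace)) if trace[i] != trace[i - 1]]

def evaluate(trace_a, trace_b, sensor_gap_m, sample_rate_hz):
    """Speed and object length from two binary (0/1) sensor traces."""
    dt = 1.0 / sample_rate_hz
    falls_a = [i for i in edges(trace_a) if trace_a[i] == 0]  # object enters sensor A
    falls_b = [i for i in edges(trace_b) if trace_b[i] == 0]  # object enters sensor B
    rises_a = [i for i in edges(trace_a) if trace_a[i] == 1]  # object leaves sensor A

    results = []
    for fa, fb, ra in zip(falls_a, falls_b, rises_a):
        delay = (fb - fa) * dt            # time delay between the sensors
        speed = sensor_gap_m / delay      # v = d / Δt
        length = speed * (ra - fa) * dt   # l = v · t_low
        results.append((speed, length))
    return results

# Toy example: 1 kHz sampling, sensors 0.1 m apart, one object at 10 m/s
# that is 0.05 m long (10 samples delay, 5 samples of low voltage).
trace_a = [1] * 10 + [0] * 5 + [1] * 20
trace_b = [1] * 20 + [0] * 5 + [1] * 10
print(evaluate(trace_a, trace_b, 0.1, 1000))
```

The object spacing would follow the same pattern from consecutive high-voltage phases; I left it out to keep the sketch short.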
It is actually not that hard to calculate these with traditional programming, but I am curious whether I can solve the problem using machine learning. It could also be more efficient when hundreds or thousands of such setups need to be evaluated in parallel. So I want to solve this with machine learning, yet I just can't find the right model to make it work.

I have tried a plain RNN layer, a single LSTM layer, multiple stacked LSTM layers (with linear layers in between), and a combination of two LSTMCells. As optimizers I tried plain gradient descent, SGD, and Adam. I also tried the L-BFGS algorithm, but that did not work either.
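To make it concrete, one of my setups looked roughly like the following sketch (in PyTorch; the class name, hidden size, and two-channel input are my own assumptions, not anything standard):

```python
import torch
import torch.nn as nn

class SpeedNet(nn.Module):
    """LSTM over the raw voltage sequence with a linear head,
    producing one prediction per timestep."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):        # x: (batch, time, 2 sensor voltages)
        out, _ = self.lstm(x)    # out: (batch, time, hidden)
        return self.head(out)    # (batch, time, 1)

model = SpeedNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(4, 100, 2)       # dummy batch of voltage sequences
y = model(x)
print(y.shape)                   # torch.Size([4, 100, 1])
```

Training was a standard MSE loop on top of this; the other attempts only swapped the recurrent part.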
Essentially I need a time-averaged output from an alternating input signal, where the average itself should change over time as the input data changes. The biggest problem I have so far is that after training I usually get one constant, averaged value as output, no matter which optimizer or model I use. To make this as clear as possible: if the object speed changes once per minute to a random value between 10 and 30 m/s, the trained network will almost certainly always output 20 m/s, regardless of the current input.
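That 20 m/s is exactly the mean of the target distribution, which I suspect is no coincidence: for any constant predictor, mean-squared error is minimized at the mean of the targets. A small self-contained demonstration with made-up numbers (speeds uniform in [10, 30] m/s, as in my example):

```python
import random

random.seed(0)
# Hypothetical targets: speeds drawn uniformly from 10–30 m/s.
speeds = [random.uniform(10, 30) for _ in range(10000)]

def mse(c):
    """MSE of the constant predictor c against the targets."""
    return sum((s - c) ** 2 for s in speeds) / len(speeds)

mean = sum(speeds) / len(speeds)
print(round(mean, 2))  # close to 20
# The mean beats any nearby constant under MSE:
print(mse(mean) < mse(mean + 1), mse(mean) < mse(mean - 1))
```

So if the network cannot extract a usable signal from the inputs it sees, collapsing to that mean is the best it can do under an MSE loss, which matches the behavior I observe.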

If anyone has experience with similar problems, or ideas about which model, algorithm, or optimizer I should try, please let me know. I am very thankful for any opinions on the matter!