Request for a sample PyTorch implementation: variable-length many-to-one LSTM/GRU regression with monotonically increasing intermediate outputs

Hello all, new member of the PyTorch community here.

I have been trying to find PyTorch code examples online for an RNN with several constraints specific to a problem I’m working on for a side project, but I haven’t had any success.

I want to build an RNN model (ideally using LSTMs or GRUs) that meets the following criteria (a rough, untested sketch of what I have in mind follows this list):

  • The model accepts variable-length inputs, with no upper limit on sequence length.
  • The model follows a many-to-one RNN setup (the input is sequential data with multiple features per timestep; the output is a single value).
  • The model produces a positive output value (a float) for any input it receives.
  • The model’s intermediate outputs monotonically increase as more timesteps are observed.
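Here is the sketch I mentioned. It is untested and every name, layer size, and design choice in it is just my assumption: the idea is that a GRU head predicts a strictly positive increment at each timestep (via softplus), and the running prediction is the cumulative sum of those increments, so it is positive and non-decreasing by construction, while variable lengths are handled with packed sequences.

```python
import torch
import torch.nn as nn


class MonotonicGRURegressor(nn.Module):
    """Many-to-one GRU whose per-step predictions are positive and non-decreasing.

    At every timestep the head predicts a strictly positive increment
    (softplus), and the running prediction is the cumulative sum of those
    increments, so it is positive and monotonically increasing by construction.
    (Names and sizes here are placeholders I made up.)
    """

    def __init__(self, input_size, hidden_size=64, num_layers=1):
        super().__init__()
        self.gru = nn.GRU(input_size, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x, lengths):
        # x: (batch, max_len, input_size), zero-padded; lengths: (batch,) int64
        packed = nn.utils.rnn.pack_padded_sequence(
            x, lengths.cpu(), batch_first=True, enforce_sorted=False
        )
        packed_out, _ = self.gru(packed)
        out, _ = nn.utils.rnn.pad_packed_sequence(packed_out, batch_first=True)

        # Strictly positive per-step increments -> their cumulative sum over
        # time is positive and monotonically increasing.
        increments = nn.functional.softplus(self.head(out)).squeeze(-1)  # (batch, max_len)
        cumulative = torch.cumsum(increments, dim=1)

        # Many-to-one: read off the running prediction at the last valid step.
        idx = (lengths - 1).unsqueeze(1).to(cumulative.device)
        y_hat = cumulative.gather(1, idx).squeeze(1)  # (batch,)
        return y_hat, cumulative
```

I am not sure this is the right way to enforce the monotonicity constraint, which is partly why I am asking here.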

Using this picture as a reference (https://scientistcafe.com/ids/images/rnnrollout.png), the criteria above map to, respectively (see the synthetic-data sketch after this list for how I would test them):

  • The model should handle varying values of <t>.
  • Only the final true value y^{<t>} is known during training (there are no labels for the intermediate timesteps).
  • \hat{y}^{<t>} > 0 for any input received by the model, and \hat{y}^{<t>} is a float.
  • \hat{y}^{<1>} \leq \cdots \leq \hat{y}^{<t-1>} \leq \hat{y}^{<t>} for all <t>.
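This is how I imagined generating synthetic dummy data and checking the constraints, again only a sketch under my own assumptions: the make_batch helper, the target definition, and all hyperparameters are made up, and it reuses the MonotonicGRURegressor class from the sketch above.

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence

torch.manual_seed(0)

# Dummy data: random-length sequences with 3 features; the target is a
# positive value that grows with sequence length, so a model whose running
# prediction increases over time should be a reasonable fit.
def make_batch(batch_size=32, n_features=3, min_len=5, max_len=40):
    lengths = torch.randint(min_len, max_len + 1, (batch_size,))
    seqs = [torch.randn(int(L), n_features) for L in lengths]
    targets = torch.stack([s.abs().sum() for s in seqs])  # positive targets
    x = pad_sequence(seqs, batch_first=True)               # (batch, max_len, n_features)
    return x, lengths, targets

model = MonotonicGRURegressor(input_size=3)  # class from the sketch above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    x, lengths, y = make_batch()
    y_hat, _ = model(x, lengths)
    loss = loss_fn(y_hat, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if step % 50 == 0:
        print(f"step {step:3d}  loss {loss.item():.2f}")

# Check: intermediate outputs are non-decreasing along the time axis.
x, lengths, _ = make_batch(batch_size=4)
with torch.no_grad():
    _, cumulative = model(x, lengths)
print(torch.all(cumulative[:, 1:] >= cumulative[:, :-1]).item())  # expect True
```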

Any explanations, corrections to my sketches above, or reproducible code implementations using synthetic dummy data would greatly help me! Thank you!