Time-series predictions from a DataLoader instead of a list

I am using an LSTM for some time-series modelling. The model runs fine, and I want to generate future predictions from it. The code below generates N predictions into the future. Basically, the loop takes an input series, makes a prediction for the t+1 step, and appends that prediction to the end of the input series. The loop then includes that prediction when predicting the t+2 step, and so forth, until I reach t+N steps.

The code looks like this, but it does not feel very polished from a memory perspective. Since I am using a list to hold the input series, there are memory constraints on how far forward I can predict, how many series I can predict, etc. My question is: is there a way to write the code below using a DataLoader instead of a list? In other words, can I dynamically append elements to the end of a DataLoader? That way I imagine I could use an iterator instead of a list and not worry about memory, etc.

import numpy as np
import torch
import torch.nn as nn

def generate_predictions(prediction_window: int, data: list, model: nn.Module):
    model.eval()
    for _ in range(prediction_window):
        # Use the last prediction_window points as the input sequence
        seq = torch.FloatTensor(data[-prediction_window:])
        # Add batch and feature dimensions: (1, seq_len, 1)
        seq = seq[np.newaxis, :, np.newaxis]
        with torch.no_grad():
            seq = seq.to(device)
            # Append the t+1 prediction so the next pass can use it
            data.append(model(seq).item())
    model.train()
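
To make concrete what I'm imagining: something like wrapping the generation step in a torch.utils.data.IterableDataset, so predictions are produced lazily and only the last window is held in memory. This is just a rough sketch of the idea, not code I have verified end to end (PredictionStream is a made-up name, and I'm assuming device is defined as above):

from collections import deque

import torch
from torch.utils.data import DataLoader, IterableDataset

class PredictionStream(IterableDataset):
    """Lazily yield autoregressive predictions, keeping only the last window."""

    def __init__(self, seed: list, window: int, model: torch.nn.Module):
        # deque with maxlen drops the oldest point automatically
        self.buffer = deque(seed[-window:], maxlen=window)
        self.model = model

    def __iter__(self):
        self.model.eval()
        while True:  # stream indefinitely; the consumer decides when to stop
            seq = torch.FloatTensor(list(self.buffer))[None, :, None].to(device)
            with torch.no_grad():
                pred = self.model(seq).item()
            self.buffer.append(pred)  # oldest entry falls off the front
            yield pred

# e.g. pull N predictions without holding the whole series in memory
loader = DataLoader(PredictionStream(data, prediction_window, model), batch_size=None)
preds = [p for p, _ in zip(loader, range(N))]

With batch_size=None the DataLoader passes items through one at a time instead of batching them, and the fixed-size deque means the history never grows beyond the window. I've left num_workers at its default of 0, since multiple workers would each duplicate the stream. Is something along these lines the right way to do it, or is there a more idiomatic pattern?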

I see it’s been quite some time since you posted this, but I’ve recently run into the same issue. Have you found a solution to it?