#Time series dataset #Anomaly detection #Stateful LSTM

I want to build an anomaly detection model,
so I'm trying to learn the normal data pattern with a stateful LSTM.
I found code for a stateful LSTM predictor, but it is written in TensorFlow (Keras).
It's very easy to enable stateful LSTM mode there:
I only have to set one hyperparameter (stateful=True) when building the LSTM model.
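
For example, in Keras it is roughly something like this (just a minimal sketch with placeholder sizes, not the actual code below):

from keras.models import Sequential
from keras.layers import LSTM, Dense

batch_size, look_back, n_features = 32, 10, 1   # placeholder values
model = Sequential()
# stateful mode only needs stateful=True plus a fixed batch_input_shape
model.add(LSTM(64, batch_input_shape=(batch_size, look_back, n_features), stateful=True))
model.add(Dense(n_features))
model.compile(loss='mse', optimizer='adam')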

Now I want to port the model to PyTorch.
How can I build the equivalent model?
Is there anything I should know before starting?

(P.S. What is RepeatVector?
What does the unroll hyperparameter of an LSTM layer do?)

The stateful LSTM model in TensorFlow (Keras) is below.

import time
import logging

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout, Activation, RepeatVector
from keras.optimizers import Adam


class StatefulMultiStepLSTM(object):

    def __init__(self, batch_size, look_back, look_ahead, layers, dropout, loss, learning_rate):
        self.batch_size = batch_size
        self.look_back = look_back
        self.look_ahead = look_ahead
        self.n_hidden = len(layers) - 2
        self.model = Sequential()
        self.layers = layers
        self.loss = loss
        # self.optimizer = optimizer
        self.learning_rate = learning_rate
        self.dropout = dropout
        logging.info("StatefulMultiStepLSTM LSTM Model Info: %s" % (locals()))

    def build_model(self):
        # first add input to hidden1; a stateful LSTM needs a fixed batch_input_shape
        # ('input' is assumed to be the input dimension stored in the layers dict)
        self.model.add(LSTM(
            units=self.layers['hidden1'],
            batch_input_shape=(self.batch_size, self.look_back, self.layers['input']),
            stateful=True,
            unroll=True,
            return_sequences=True if self.n_hidden > 1 else False))
        self.model.add(Dropout(self.dropout))  # dropout=0.1 -> drop 10% of units

        # add hidden layers
        for i in range(2, self.n_hidden + 1):
            return_sequences = True
            if i == self.n_hidden:
                return_sequences = False
            self.model.add(LSTM(units=self.layers["hidden" + str(i)], stateful=True,
                                return_sequences=return_sequences, unroll=True))

        # add dense layer with output dimension to get output for one time_step
        # ('output' is assumed to be the output dimension stored in the layers dict)
        self.model.add(Dense(units=self.layers['output']))

        # Repeat for look_ahead steps to get outputs for look_ahead timesteps.
        self.model.add(RepeatVector(self.look_ahead))

        # add activation
        self.model.add(Activation('linear'))

        # compile model and print summary
        start = time.time()
        self.model.compile(loss=self.loss, optimizer=Adam(lr=self.learning_rate, decay=.99))
        # self.model.compile(loss=self.loss, optimizer=Adam(lr=self.learning_rate))
        logging.info("Compilation Time : %s" % str(time.time() - start))
        self.model.summary()
        return self.model

Hello, I think this can be helpful.
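
As far as I know, PyTorch has no stateful flag on nn.LSTM: you get the same behaviour by carrying the (h, c) hidden state across forward() calls yourself and only resetting it at the start of a new sequence. There is also no unroll option and no RepeatVector layer; repeating the last output look_ahead times is just a tensor repeat. Here is a minimal sketch of a rough equivalent, assuming a (batch, look_back, n_features) batch-first input; the layer sizes, the two-layer depth, and the reset_state() helper are my own placeholders, not something fixed by PyTorch.

import torch
import torch.nn as nn


class StatefulMultiStepLSTM(nn.Module):
    """Rough PyTorch counterpart: 'stateful' just means we carry (h, c) ourselves."""

    def __init__(self, n_features, hidden_size, look_ahead, num_layers=2, dropout=0.1):
        super().__init__()
        self.look_ahead = look_ahead
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers=num_layers,
                            dropout=dropout, batch_first=True)
        self.out = nn.Linear(hidden_size, n_features)
        self.state = None  # (h, c), carried across forward() calls

    def reset_state(self):
        # call at the start of each epoch/sequence, like model.reset_states() in Keras
        self.state = None

    def forward(self, x):
        # x: (batch, look_back, n_features); keep the batch size fixed and feed the
        # batches in chronological order, just like a stateful Keras model.
        if self.state is not None:
            # detach so gradients do not flow back into earlier batches
            self.state = tuple(s.detach() for s in self.state)
        lstm_out, self.state = self.lstm(x, self.state)
        last = lstm_out[:, -1, :]                        # (batch, hidden_size)
        # Keras RepeatVector: repeat the last output along a new time axis
        repeated = last.unsqueeze(1).repeat(1, self.look_ahead, 1)
        return self.out(repeated)                        # (batch, look_ahead, n_features)

During training you would call reset_state() once per epoch (or whenever a new sequence starts) and then feed the batches in order, so the state flows from one batch to the next the way stateful=True does in Keras.

About the P.S. questions: RepeatVector(n) in Keras repeats its 2D input n times along a new time axis (a (batch, features) tensor becomes (batch, n, features)), which is how the Keras model turns one LSTM output into look_ahead output steps; the .repeat(1, self.look_ahead, 1) call above plays the same role. unroll=True tells Keras to unroll the recurrence into a static feed-forward graph instead of using a symbolic loop, which can be faster for short sequences but uses more memory; PyTorch builds its graph dynamically, so there is nothing equivalent to set.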