In the Lua version of LSTM (GitHub - Element-Research/rnn: Recurrent Neural Network library for Torch7's nn) there was a rho parameter indicating the number of backpropagation-through-time (BPTT) steps. Where is this parameter in the PyTorch version of LSTM?
The nn.LSTM(inputSize, outputSize, [rho]) constructor takes 3 arguments:
inputSize : a number specifying the size of the input;
outputSize : a number specifying the size of the output;
rho : the maximum number of backpropagation steps to take back in time. Limits the number of previous steps kept in memory. Defaults to 9999.
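PyTorch's `torch.nn.LSTM` has no equivalent `rho` argument. The usual way to get the same effect is to truncate BPTT manually: run the sequence in chunks and call `.detach()` on the hidden and cell states every `rho` steps, so gradients stop flowing further back. A minimal sketch (all sizes, the `rho` value, and the random input here are illustrative assumptions, not from the original post):

```python
import torch
import torch.nn as nn

rho = 5                      # truncation length, analogous to Lua's rho
input_size, hidden_size = 10, 20
lstm = nn.LSTM(input_size, hidden_size)

seq = torch.randn(25, 1, input_size)          # (seq_len, batch, input_size)
h = torch.zeros(1, 1, hidden_size)
c = torch.zeros(1, 1, hidden_size)

outputs = []
for t, x_t in enumerate(seq.split(1)):
    out, (h, c) = lstm(x_t, (h, c))
    outputs.append(out)
    if (t + 1) % rho == 0:
        # Cut the autograd graph: gradients will not propagate
        # past this point when backward() is called.
        h, c = h.detach(), c.detach()

loss = torch.cat(outputs).sum()
loss.backward()              # BPTT runs at most rho steps back
```

This keeps memory bounded in the same way `rho` did in the Lua library, since detached states carry no graph history.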