Where is rho in LSTM?

In the Lua version of LSTM (GitHub - Element-Research/rnn: Recurrent Neural Network library for Torch7's nn) there was a rho parameter indicating the number of backpropagation through time (BPTT) steps. Where is this parameter in the PyTorch version of LSTM?

The nn.LSTM(inputSize, outputSize, [rho]) constructor takes 3 arguments:

inputSize : a number specifying the size of the input;
outputSize : a number specifying the size of the output;
rho : the maximum number of backpropagation steps to take back in time. Limits the number of previous steps kept in memory. Defaults to 9999.

This is not provided in PyTorch. The number of BPTT steps is simply the length of the sequence you pass to the LSTM (and where you detach the hidden state between calls).
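For reference, here is a minimal sketch of how the equivalent truncation is typically done in PyTorch: the chunk length plays the role of rho, and detaching the hidden state stops gradients from flowing further back. All sizes, the optimizer, and the loss here are made-up illustration values, not anything from the original Lua API:

```python
import torch
import torch.nn as nn

# hypothetical sizes, purely for illustration
input_size, hidden_size, batch_size, rho = 10, 20, 4, 5

lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
optimizer = torch.optim.SGD(lstm.parameters(), lr=0.01)

# one long sequence, split into chunks of length rho along the time dimension
long_seq = torch.randn(batch_size, 50, input_size)
hidden = None

for chunk in long_seq.split(rho, dim=1):
    out, hidden = lstm(chunk, hidden)
    loss = out.pow(2).mean()  # dummy loss, just for the sketch
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # detach so the next chunk's backward pass stops here,
    # i.e. gradients never flow back more than rho steps
    hidden = tuple(h.detach() for h in hidden)
```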

@smth Ok, so I should always provide the full sequence that I want to train on? Something like [batch_size, rho, input_size] (when batch_first=True)?
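In case it helps, this is what I mean (sizes are made-up):

```python
import torch
import torch.nn as nn

batch_size, rho, input_size, hidden_size = 4, 5, 10, 20

lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

# with batch_first=True the input is (batch, seq_len, input_size),
# so rho would just be the sequence length of whatever I pass in
x = torch.randn(batch_size, rho, input_size)
output, (h_n, c_n) = lstm(x)

print(output.shape)  # torch.Size([4, 5, 20])
```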