Latent Code for Recurrent Encoder

Hi All,

I am building a recurrent autoencoder using PyTorch. For the encoder, I am using a GRU with 2 layers.
Assume the input is univariate time-series data of shape (B, S, 1), where B is the batch size and S is the sequence length. The encoder (the 2-layer GRU) returns two outputs: the output sequence and the final hidden states. Can somebody explain which one I should use as the latent code for the autoencoder?
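For reference, here is a minimal sketch of the encoder I am describing (the class name, hidden size, and example shapes are just placeholders, not my actual code), showing the shapes of the two outputs in question:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, hidden_size=64):
        super().__init__()
        # 2-layer GRU over a univariate sequence; batch_first so input is (B, S, 1)
        self.gru = nn.GRU(input_size=1, hidden_size=hidden_size,
                          num_layers=2, batch_first=True)

    def forward(self, x):                 # x: (B, S, 1)
        output, hidden = self.gru(x)
        # output: (B, S, hidden_size) -- per-time-step outputs of the top layer
        # hidden: (2, B, hidden_size) -- final hidden state of each of the 2 layers
        return output, hidden

x = torch.randn(8, 50, 1)                 # B=8, S=50, one feature
output, hidden = Encoder()(x)
print(output.shape, hidden.shape)         # (8, 50, 64) and (2, 8, 64)
```

So the question is whether the latent code should come from `output` (e.g. its last time step) or from `hidden`.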

Regards
Pranavan