Keras-like behaviour is needed


Implementing an autoregressor in Keras looks something like the following snippet:

import keras

def network_autoregressive(x):
    ''' Define the network that integrates information along the sequence '''
    x = keras.layers.GRU(units=256, return_sequences=False, name='ar_context')(x)
    return x

Here the input x has shape 32x4x128 (batch x time x features), and the output has shape 32x256. I am not able to reproduce this behaviour with PyTorch's GRU. Any help is much appreciated.


Thanks to everyone who read this topic. I found the solution: in Keras, return_sequences=False returns only the output at the final timestep of the sequence dimension. The equivalent result can therefore be obtained by slicing the last timestep from the output of PyTorch's GRU.
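A minimal sketch of that slicing approach, using the shapes from the original post (32x4x128 in, 32x256 out); the variable names are illustrative, not from any particular codebase:

```python
import torch

# Assumed shapes from the post: batch=32, seq_len=4, features=128
x = torch.randn(32, 4, 128)

# batch_first=True matches Keras' (batch, time, features) layout
gru = torch.nn.GRU(input_size=128, hidden_size=256, batch_first=True)

# output contains every timestep: shape (32, 4, 256)
output, h_n = gru(x)

# Slicing the last timestep mimics Keras' return_sequences=False
context = output[:, -1, :]

print(context.shape)  # torch.Size([32, 256])
```

Note that for a single-layer, unidirectional GRU, `h_n[-1]` gives the same tensor as `output[:, -1, :]`, so either form works here.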
