GRU produces the same output for all inputs

Hi All,

I am using a neural network that consists of a GRU and other modules, with the GRU as the final layer. For every different input, the GRU layer produces the same output, while the earlier layers produce different outputs. What could be the reason?
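In case it helps, here is a minimal sketch of the kind of setup I mean (the class name `Net`, the `fc` stand-in for the other modules, and all dimensions are placeholders, not my actual code):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, in_dim=16, hidden=32):
        super().__init__()
        self.fc = nn.Linear(in_dim, hidden)  # stand-in for the "other modules"
        self.gru = nn.GRU(hidden, hidden, batch_first=True)  # final layer

    def forward(self, x):
        feats = self.fc(x)        # (batch, seq_len, hidden) - differs per input
        out, h = self.gru(feats)  # out is (nearly) identical across inputs
        return feats, out

net = Net()
x1 = torch.randn(1, 10, 16)
x2 = torch.randn(1, 10, 16)
feats1, out1 = net(x1)
feats2, out2 = net(x2)
# The inputs reaching the GRU differ, yet its outputs come out the same:
print("GRU inputs differ: ", not torch.allclose(feats1, feats2))
print("GRU outputs differ:", not torch.allclose(out1, out2))
```

This is roughly how I am checking the intermediate activations; the layers before the GRU clearly produce different values for different inputs.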

Thanks in advance.