Implementing mixture of RNNs: LSTM modules sharing parameters

I'm new to PyTorch; I've switched my project from TensorFlow because I need to work with dynamic computational graphs and modularity, but I've run into some difficulties.

I'm trying to implement a mixture of RNNs, something like this: http://cvgl.stanford.edu/papers/jain_cvpr16.pdf .
I need to define different LSTM modules to use in different parts of the model, but they must share the same parameters. How can I do that?

You define one LSTM module and reuse it in multiple places (this is completely fine). Since it's the same module instance, every call uses the same set of parameters.
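A minimal sketch of what that reuse could look like, assuming an `nn.LSTMCell` and made-up names/sizes (`MixtureOfRNNs`, `branch_inputs`, etc. are purely illustrative, not from the paper):

```python
import torch
import torch.nn as nn


class MixtureOfRNNs(nn.Module):
    """One LSTMCell instance applied to several input branches,
    so all branches share the same weights."""

    def __init__(self, input_size=16, hidden_size=32):
        super().__init__()
        # A single module instance means a single set of parameters.
        self.shared_lstm = nn.LSTMCell(input_size, hidden_size)
        self.hidden_size = hidden_size

    def forward(self, branch_inputs):
        # branch_inputs: list of tensors, each of shape (batch, input_size)
        outputs = []
        for x in branch_inputs:
            h = x.new_zeros(x.size(0), self.hidden_size)
            c = x.new_zeros(x.size(0), self.hidden_size)
            # Calling the same module in different places reuses its parameters.
            h, c = self.shared_lstm(x, (h, c))
            outputs.append(h)
        return outputs


model = MixtureOfRNNs()
branches = [torch.randn(4, 16) for _ in range(3)]
outs = model(branches)
# Only one LSTMCell's worth of parameters exists, shared across all branches:
print(sum(p.numel() for p in model.parameters()))
```

Gradients from every branch accumulate into the same shared weights during backprop, which is exactly the parameter sharing you want.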

Sorry, that was obvious. I hadn't yet understood how torch modules work; this solves it very easily.