Running multiple LSTMs in parallel

Hi there, I have a case (in a Q&A system) where I need to run multiple LSTMs in parallel.
In the architecture below, we can see that the embeddings for the passage and the question can each be computed independently; my question is how to run both in parallel.
[architecture screenshot]

Let's say I have the following code:

import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()  # required before registering submodules
        self.passage_bilstm = nn.LSTM(300, 300, 1, bidirectional=True)
        self.question_bilstm = nn.LSTM(300, 300, 1, bidirectional=True)
        ...

    def forward(self, passage, question):
        emb_passage = self.embedding_passage(passage)
        emb_question = self.embedding_question(question)

        # If I run them as follows, I assume they are executed sequentially, right?
        # If my assumption is correct, is there a way to run these in parallel so that
        # question_bilstm doesn't have to wait for passage_bilstm?
        self.passage_bilstm(emb_passage)
        self.question_bilstm(emb_question)

Or is my assumption wrong?
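
For illustration, here is a minimal sketch, assuming a CUDA device, of launching the two LSTMs on separate CUDA streams so their kernels are at least allowed to overlap. The variable names and input shapes are made up for this example, and whether the kernels actually overlap depends on the GPU having spare capacity:

import torch
import torch.nn as nn

# dummy inputs with made-up shapes: (seq_len, batch, features), as nn.LSTM expects by default
emb_passage = torch.randn(50, 32, 300, device="cuda")
emb_question = torch.randn(20, 32, 300, device="cuda")

passage_bilstm = nn.LSTM(300, 300, 1, bidirectional=True).cuda()
question_bilstm = nn.LSTM(300, 300, 1, bidirectional=True).cuda()

stream_a = torch.cuda.Stream()
stream_b = torch.cuda.Stream()

torch.cuda.synchronize()  # make sure prior work is done before forking streams
with torch.cuda.stream(stream_a):
    passage_out, _ = passage_bilstm(emb_passage)
with torch.cuda.stream(stream_b):
    question_out, _ = question_bilstm(emb_question)
torch.cuda.synchronize()  # join both streams before using the outputs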

Hello, akurniawan. I'm new to PyTorch and I have the same question as you. May I ask if you have solved the problem?
I guess that in your code,

self.passage_bilstm(emb_passage)
self.question_bilstm(emb_question)

the Python code is written sequentially, but since PyTorch runs the operations through its compute graph, and in the compute graph the two layers are independent, the two calls may actually run in parallel even though they are written in sequence. That's my assumption, but I didn't test it. Am I right? Thank you!
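
For what it's worth, a rough way to test that assumption is to time the two calls with CUDA events, which account for PyTorch's asynchronous kernel launches. The model sizes and input shapes below are placeholders:

import torch
import torch.nn as nn

passage_bilstm = nn.LSTM(300, 300, 1, bidirectional=True).cuda()
question_bilstm = nn.LSTM(300, 300, 1, bidirectional=True).cuda()

# dummy inputs with made-up shapes: (seq_len, batch, features)
emb_passage = torch.randn(50, 32, 300, device="cuda")
emb_question = torch.randn(20, 32, 300, device="cuda")

start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)

start.record()
passage_bilstm(emb_passage)
question_bilstm(emb_question)
end.record()
torch.cuda.synchronize()  # wait for all queued kernels to finish
print(f"elapsed: {start.elapsed_time(end):.2f} ms")

Timing each LSTM on its own and comparing against this would show whether the second call actually overlaps with the first.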