Multiple neural networks integrated into one

I am trying to recreate a paper where the neural network takes in N matrices (each, say, 4 x 500) from different sources and tries to learn joint features.

Each matrix goes into its own sub-network and is converted into a 1 x 500 vector through convolutions etc.

These vectors are then stacked to make an N x 500 matrix, which undergoes more transformations to produce a final 1 x 500 vector.

Backprop must link all the sub-networks, so the final loss can propagate gradients back to the filters learned in each sub-network.

Is this possible in pytorch?

YES. This is handled automatically by autograd, as long as you implement forward() correctly.
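
Here is a minimal sketch of what that could look like. The SubNet / JointNet classes and the specific Conv1d layer sizes are just placeholders I made up, not the paper's actual architecture; the point is only that wrapping the sub-networks in an nn.ModuleList and calling them inside one forward() is enough for gradients to flow from the final loss back into every sub-network's filters.

```python
import torch
import torch.nn as nn

class SubNet(nn.Module):
    """Hypothetical per-source encoder: (batch, 4, 500) -> (batch, 500)."""
    def __init__(self):
        super().__init__()
        # treat the 4 rows of each input matrix as channels of a 1-D signal
        self.conv = nn.Sequential(
            nn.Conv1d(4, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(16, 1, kernel_size=3, padding=1),  # -> (batch, 1, 500)
        )

    def forward(self, x):
        return self.conv(x).squeeze(1)  # (batch, 500)

class JointNet(nn.Module):
    """Hypothetical joint model: N sub-networks followed by a fusion step."""
    def __init__(self, n_sources):
        super().__init__()
        self.subnets = nn.ModuleList(SubNet() for _ in range(n_sources))
        # fuse the stacked (batch, N, 500) matrix down to (batch, 500)
        self.fuse = nn.Sequential(
            nn.Conv1d(n_sources, 1, kernel_size=3, padding=1),
            nn.ReLU(),
        )

    def forward(self, inputs):  # inputs: list of N tensors, each (batch, 4, 500)
        vectors = [net(x) for net, x in zip(self.subnets, inputs)]
        stacked = torch.stack(vectors, dim=1)  # (batch, N, 500)
        return self.fuse(stacked).squeeze(1)   # (batch, 500)

# quick check that gradients reach the sub-network filters
N = 3
model = JointNet(N)
inputs = [torch.randn(8, 4, 500) for _ in range(N)]
target = torch.randn(8, 500)
loss = nn.functional.mse_loss(model(inputs), target)
loss.backward()
print(model.subnets[0].conv[0].weight.grad is not None)  # True
```

Because everything happens in a single computation graph, there is nothing special to do for the backward pass: one call to loss.backward() fills in .grad for the fusion layers and all N sub-networks, and a single optimizer constructed from model.parameters() updates them together.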