Create a network where the number/type of layers is given as a parameter

Hi,

Is there any way to build a network with a variable number of layers?

For L inputs, we have L first layers which run in parallel and don’t share parameters. L is given as a parameter.

To keep it simple, let’s suppose that all the layers to create are linear.

At a given intermediate layer the results are concatenated.

Here is my pseudocode:

import torch
import torch.nn as nn

class MyNet(nn.Module):

    def __init__(self, net_parameters):
        super().__init__()
        d, n, m, c, n_classes, number_of_layers = net_parameters

        # use nn.ModuleList (not a plain Python list) so the layers'
        # parameters are registered with the module and seen by the optimizer
        self.designed_layers = nn.ModuleList()

        for _ in range(number_of_layers):
            layer = nn.Sequential(nn.Linear(m, m * c), nn.ReLU(True))
            self.designed_layers.append(layer)

        self.fc1 = nn.Linear(n * m, n_classes)

    def forward(self, x):
        # x is a list of tensors, one per parallel layer
        new_x = []
        for i, layer in enumerate(self.designed_layers):
            new_x.append(layer(x[i]))
        # concatenate along the feature dimension; the concatenated
        # size must match fc1's input size (n * m)
        x = torch.cat(new_x, dim=1)
        x = self.fc1(x)
        return x
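A minimal self-contained sketch of the same pattern, with made-up dimensions chosen so the shapes line up (each of the 3 branches maps 4 features to 8, so the concatenation gives 24 = num_layers * m * c features into the classifier):

```python
import torch
import torch.nn as nn

class ParallelNet(nn.Module):
    def __init__(self, num_layers, m, c, n_classes):
        super().__init__()
        # one independent Linear+ReLU branch per input;
        # nn.ModuleList registers each branch's parameters
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Linear(m, m * c), nn.ReLU(True))
            for _ in range(num_layers)
        )
        # the concatenated branch outputs feed one shared classifier
        self.fc1 = nn.Linear(num_layers * m * c, n_classes)

    def forward(self, xs):
        # xs: list of num_layers tensors, each of shape (batch, m)
        feats = [branch(x) for branch, x in zip(self.branches, xs)]
        return self.fc1(torch.cat(feats, dim=1))

net = ParallelNet(num_layers=3, m=4, c=2, n_classes=5)
xs = [torch.randn(8, 4) for _ in range(3)]
out = net(xs)
print(out.shape)  # torch.Size([8, 5])
```

Because the branches live in an `nn.ModuleList`, `net.parameters()` includes all of them, so a single optimizer over `net.parameters()` trains every branch plus `fc1`.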
     
Thank you