Dynamic Network Creation

I have a JSON file wherein I have defined a network (Consists of Conv and Dense layers)

I want to create the network dynamically based on this JSON file, so that its layers follow the file: for example, the network might have 4 layers or 10 layers.

How can I achieve this in PyTorch?

For example, my JSON file looks like this:
{
  "name": "Arch 1",
  "batch_size": 50,
  "layers": [
    {
      "type": "Input",
      "output_shape": [...]
    },
    {
      "type": "Conv2D",
      "num_filters": 32,
      "filter_size": [...],
      "non_linearity": "rectify",
      "conv_mode": "same"
    },
    {
      "type": "MaxPool2D",
      "filter_size": [...]
    }
  ]
}

If this was exported from some other program and you can export it in ONNX or Caffe format instead, my understanding is that PyTorch has native importers for one or both of those formats.

Otherwise, if it's a proprietary format description, you can write code to convert your JSON format into a Python object hierarchy (json.loads(json_string)), then iterate over it.
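For example, with a trimmed-down version of the JSON above (field names assumed from the question), json.loads gives you plain dicts and lists you can loop over:

```python
import json

# A cut-down version of the architecture JSON from the question
net_string = '''
{
  "name": "Arch 1",
  "layers": [
    {"type": "Conv2D", "num_filters": 32},
    {"type": "MaxPool2D"}
  ]
}
'''

net_def = json.loads(net_string)
for layer_def in net_def["layers"]:
    # each layer_def is an ordinary dict, e.g. {"type": "Conv2D", ...}
    print(layer_def["type"])
```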

you can do something like:

import json
import torch.nn as nn

class MyNetwork(nn.Module):
    def __init__(self, net_string):
        super().__init__()
        self.module_list = nn.ModuleList()
        for layer_def in json.loads(net_string):
            layer = self._create_layer(layer_def)
            # appending registers the layer so its parameters are tracked
            self.module_list.append(layer)

    def forward(self, x):
        for layer in self.module_list:
            x = layer(x)
        return x

(edit: ie, nn.ModuleList is quite useful for such dynamic layer lists)

And what about the non-linearities?
Can you please explain with an example? In the example in the official documentation, MaxPooling and ReLU are applied in the forward function.

I mean, conceptually, you can add any network module to a module list. The only example I have that is open source is unfortunately in C++, and not for PyTorch: https://github.com/hughperkins/DeepCL/blob/master/src/netdef/NetdefToNet.cpp#L104

You're going to make _create_layer have an if statement, something like:

if layer_def['type'] == 'relu':
    return nn.ReLU()
elif layer_def['type'] == 'conv':
    return [stuff here to create a conv layer]
elif ... etc ...
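A fleshed-out sketch of that idea, assuming the field names from the JSON in the question (num_filters, filter_size, conv_mode) and that you track the current channel count yourself, might look like this. Note that padding='same' requires a reasonably recent PyTorch; on older versions you'd compute the padding manually.

```python
import torch
import torch.nn as nn

def create_layer(layer_def, in_channels):
    """Map one JSON layer dict to an nn.Module.

    Field names are assumed from the JSON in the question; adapt them
    to your actual format. Returns (module, out_channels) so the caller
    can thread the channel count through to the next layer.
    """
    if layer_def['type'] == 'Conv2D':
        out_channels = layer_def['num_filters']
        conv = nn.Conv2d(
            in_channels, out_channels,
            kernel_size=layer_def['filter_size'],
            # 'same' padding needs PyTorch >= 1.9
            padding='same' if layer_def.get('conv_mode') == 'same' else 0,
        )
        return conv, out_channels
    elif layer_def['type'] == 'MaxPool2D':
        return nn.MaxPool2d(kernel_size=layer_def['filter_size']), in_channels
    elif layer_def['type'] == 'ReLU':
        return nn.ReLU(), in_channels
    else:
        raise ValueError(f"Unknown layer type: {layer_def['type']}")
```

For the "non_linearity" field attached to the Conv2D entry, one option is to have the Conv2D branch return an nn.Sequential of the convolution followed by the activation, so the forward loop stays a plain chain of modules.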

I have one more doubt.
Say I have a dense/linear layer with (1, 1, 3136) units.

How do I reshape this to (64, 7, 7) for a conv layer and feed that input to a transpose conv layer?

I am trying to create a Convolutional Autoencoder actually

You could write a custom layer which does nothing except reshaping and add this layer in front of your convolutional layer:

class ReshapeLayer(torch.nn.Module):
    def __init__(self, batch_size, n_channels, height, width):
        super().__init__()
        self.shape = [batch_size, n_channels, height, width]

    def forward(self, x):
        return x.view(*self.shape)

EDIT: if you don't know all of the dimension sizes, you can also pass -1 for one dimension. With -1, that dimension is set so that it fits the total number of entries in the tensor, calculated with respect to the other dimensions.
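For the autoencoder case above, here is that reshape in isolation. The (64, 7, 7) target works because 64 × 7 × 7 = 3136, so the element count matches; -1 lets view() infer the batch dimension. The transpose conv parameters are just an illustration:

```python
import torch
import torch.nn as nn

x = torch.randn(5, 3136)     # output of the linear layer, batch of 5
x = x.view(-1, 64, 7, 7)     # -1: infer the batch dimension (5 here)

# feed the reshaped tensor into a transpose convolution;
# kernel 2, stride 2 doubles the spatial size: 7x7 -> 14x14
deconv = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
y = deconv(x)
print(y.shape)  # torch.Size([5, 32, 14, 14])
```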

How should I flatten the data before feeding it to the FC layer, i.e. the other way round?
Will this work? Is it necessary to create an init method?

class Flatten(nn.Module):
    def forward(self, x):
        x = x.view(x.size(0), -1)
        return x

Strictly speaking, an explicit init method is not required here: if you don't define one, the inherited nn.Module init runs automatically when the class is instantiated. You only need to write your own init (and call super().__init__() inside it) when the layer has parameters or submodules to set up.

So yes, your Flatten class should work the way you want it to.
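Putting the two directions together, here is a minimal check of the Flatten idea inside a hypothetical encoder tail (Flatten is redefined so the snippet is self-contained; the layer sizes are just an illustration):

```python
import torch
import torch.nn as nn

class Flatten(nn.Module):
    def forward(self, x):
        return x.view(x.size(0), -1)

# conv features -> flatten -> linear, e.g. the encoder side of an autoencoder
model = nn.Sequential(
    nn.Conv2d(1, 64, kernel_size=3, padding=1),  # 28x28 stays 28x28
    nn.MaxPool2d(4),                             # 28x28 -> 7x7
    Flatten(),                                   # (N, 64, 7, 7) -> (N, 3136)
    nn.Linear(64 * 7 * 7, 10),
)

out = model(torch.randn(2, 1, 28, 28))
print(out.shape)  # torch.Size([2, 10])
```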