Trying to convert Keras model to PyTorch

I am trying to convert a GAN from Keras to PyTorch, but I'm not entirely sure how to do so. The two models below (the generator and the discriminator, respectively) are what I want to convert:

tf.keras.Sequential([
  tf.keras.layers.Dense(
      1024, None,
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      input_shape=(64,), name = 'Dense_Layer_1',),
  tf.keras.layers.BatchNormalization(
      momentum=0.999, epsilon=0.001, name = 'Batch_Normalization_1',),
  tf.keras.layers.ReLU(name = 'ReLU_Activation_1',),
  tf.keras.layers.Dense(
      12544, None,
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight), name = 'Dense_Layer_2',),
  tf.keras.layers.BatchNormalization(
      momentum=0.999, epsilon=0.001, name = 'Batch_Normalization_2',),
  tf.keras.layers.ReLU(name = 'ReLU_Activation_2',),
  tf.keras.layers.Reshape([7, 7, 256], name = 'Reshape_Layer',),
  tf.keras.layers.Convolution2DTranspose(
      64, [4, 4], strides=[2, 2], 
      activation=tf.nn.relu, padding='same',
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight), name = '2D_Convolution_Transpose_1',),
  tf.keras.layers.Convolution2DTranspose(
      64, [4, 4], strides=[2, 2], 
      activation=tf.nn.relu, padding='same',
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight), name = '2D_Convolution_Transpose_2',),
  tf.keras.layers.Convolution2D(
      1, [4, 4], strides=[1, 1], 
      activation='tanh', padding='same',
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight), name = '2D_Convolution',),
])
tf.keras.Sequential([
  tf.keras.layers.Convolution2D(
      64, [4, 4], strides=[2, 2], 
      activation='tanh', padding='same',
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      input_shape=(28,28,1)),
  tf.keras.layers.LeakyReLU(),
  tf.keras.layers.Convolution2D(
      128, [4, 4], strides=[2, 2], 
      activation='tanh', padding='same',
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight)),
  tf.keras.layers.LeakyReLU(),
  tf.keras.layers.Flatten(),
  tf.keras.layers.Dense(
      1024, None,
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight)),
  tf.keras.layers.BatchNormalization(
      momentum=0.999, epsilon=0.001, ),
  tf.keras.layers.LeakyReLU(),
  tf.keras.layers.Dense(
      1, None,
      kernel_initializer=tf.keras.initializers.glorot_uniform(),
      kernel_regularizer=tf.keras.regularizers.l2(l=l2_weight),
      bias_regularizer=tf.keras.regularizers.l2(l=l2_weight)),
])

The part where all the layers are defined makes sense to me, but I have no clue how to approach defining the forward pass in PyTorch.

Once you’ve created all modules in the __init__ method, you can directly call them in the forward method to create the forward pass.
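For your generator that could look roughly like this. It's just an untested sketch (the class and attribute names are mine): I've dropped the l2 kernel/bias regularizers, since PyTorch has no layer-level equivalent (more on that below), and since PyTorch uses channels-first tensors, the Keras Reshape([7, 7, 256]) becomes a view to (256, 7, 7):

import torch
import torch.nn as nn


class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        # Keras' BatchNormalization(momentum=0.999) corresponds to
        # momentum = 1 - 0.999 = 0.001 in PyTorch (the conventions are inverted)
        self.fc1 = nn.Linear(64, 1024)
        self.bn1 = nn.BatchNorm1d(1024, eps=0.001, momentum=0.001)
        self.fc2 = nn.Linear(1024, 12544)  # 12544 = 256 * 7 * 7
        self.bn2 = nn.BatchNorm1d(12544, eps=0.001, momentum=0.001)
        # padding=1 reproduces Keras' padding='same' for kernel_size=4, stride=2
        self.deconv1 = nn.ConvTranspose2d(256, 64, kernel_size=4, stride=2, padding=1)  # 7x7 -> 14x14
        self.deconv2 = nn.ConvTranspose2d(64, 64, kernel_size=4, stride=2, padding=1)   # 14x14 -> 28x28
        self.conv = nn.Conv2d(64, 1, kernel_size=4, stride=1, padding='same')  # 'same' works for stride=1 in recent PyTorch versions
        self.relu = nn.ReLU()

    def forward(self, z):  # z: [batch_size, 64]
        x = self.relu(self.bn1(self.fc1(z)))
        x = self.relu(self.bn2(self.fc2(x)))
        x = x.view(-1, 256, 7, 7)  # channels-first, unlike the Keras [7, 7, 256] reshape
        x = self.relu(self.deconv1(x))
        x = self.relu(self.deconv2(x))
        return torch.tanh(self.conv(x))  # [batch_size, 1, 28, 28]

PyTorch's default initialization also differs from glorot_uniform; if you want to match it, you could apply nn.init.xavier_uniform_ to the weight tensors via model.apply.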
The CIFAR10 tutorial shows how to create a simple CNN and how these layers are used in the forward method. Let us know if you get stuck. :wink:
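For completeness, the discriminator could be sketched the same way (again untested and with the same caveats; also note that Keras' LeakyReLU uses alpha=0.3 by default, while PyTorch's nn.LeakyReLU defaults to 0.01):

import torch
import torch.nn as nn


class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 64, kernel_size=4, stride=2, padding=1)    # 28x28 -> 14x14
        self.conv2 = nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1)  # 14x14 -> 7x7
        self.fc1 = nn.Linear(128 * 7 * 7, 1024)
        self.bn = nn.BatchNorm1d(1024, eps=0.001, momentum=0.001)
        self.fc2 = nn.Linear(1024, 1)
        # match Keras' default LeakyReLU slope of 0.3
        self.lrelu = nn.LeakyReLU(0.3)

    def forward(self, x):  # x: [batch_size, 1, 28, 28]
        x = self.lrelu(torch.tanh(self.conv1(x)))  # mirrors activation='tanh' followed by LeakyReLU in the Keras code
        x = self.lrelu(torch.tanh(self.conv2(x)))
        x = x.flatten(1)  # [batch_size, 128 * 7 * 7]
        x = self.lrelu(self.bn(self.fc1(x)))
        return self.fc2(x)  # raw logit; use e.g. nn.BCEWithLogitsLoss

For the l2 regularization the usual approach in PyTorch is to pass weight_decay=l2_weight to the optimizer, e.g. torch.optim.Adam(model.parameters(), weight_decay=l2_weight), keeping in mind that this also decays the batchnorm parameters and biases unless you exclude them via parameter groups (or add the penalty to the loss manually).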