I am trying to create a general feedforward network with CNN and fully connected (FNN) layers. Depending on the parameters passed to the class, I want to create the fully connected layers according to how the config is specified.
Would this be a good way of doing it inside a class EncodingNetwork(nn.Module)?
in_features = input_size
for num_units, dropout_rate, weight_decay in zip(
        fc_layer_params, dropout_layer_params, weight_decay_params):
    linear = nn.Linear(in_features, num_units, dtype=dtype)
    kernel_initializer(linear.weight)  # e.g. nn.init.xavier_uniform_
    layers.append(linear)
    layers.append(activation_fn())     # e.g. nn.ReLU
    if dropout_rate is not None:
        layers.append(nn.Dropout(p=dropout_rate))
    in_features = num_units
    # nn.Linear has no kernel_regularizer argument -- where does weight_decay go?
Also, I am confused about how to specify L2 regularization (the Keras kernel_regularizer equivalent) in PyTorch. I couldn't find a per-layer way to do this, but surely there is one.
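For context on the question above: PyTorch does not attach regularizers to layers; L2 regularization is usually applied through the optimizer's weight_decay argument, and per-layer values can be expressed with parameter groups. A minimal sketch, assuming two hypothetical linear layers and per-layer decay values standing in for weight_decay_params:

```python
import torch
import torch.nn as nn

# Hypothetical per-layer decay values standing in for weight_decay_params
weight_decay_params = [1e-4, 1e-3]

layers = nn.ModuleList([nn.Linear(8, 16), nn.Linear(16, 4)])

# One parameter group per layer, each with its own weight_decay (L2 strength)
param_groups = [
    {"params": layer.parameters(), "weight_decay": wd}
    for layer, wd in zip(layers, weight_decay_params)
]
optimizer = torch.optim.SGD(param_groups, lr=0.01)

# Alternative: add the L2 penalty to the loss by hand, weights only
l2_penalty = sum(
    wd * layer.weight.pow(2).sum()
    for layer, wd in zip(layers, weight_decay_params)
)
```

Note that optimizer weight_decay also decays biases in each group; the manual-penalty variant shown regularizes only the weight matrices.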