PyTorch Equivalent for kernel_regularizer in TensorFlow

I am trying to convert TensorFlow code to PyTorch. I am stuck implementing this line:

```python
Conv3DTranspose(nb_channels, kernel_size=(3, 3, 3), kernel_regularizer=l2(0.001), strides=_strides, padding='same')(y)
```

How do I add a kernel_regularizer in PyTorch?

You could either calculate the L2 penalty of these weights manually and add it to the loss, as seen e.g. here, or you could pass these parameters to the optimizer with weight_decay.
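For illustration, here is a minimal sketch of both options, assuming a single `ConvTranspose3d` layer (the PyTorch counterpart of `Conv3DTranspose`); the channel sizes are made up. Keras' `l2(0.001)` adds `0.001 * sum(w**2)` to the loss, while for plain SGD `weight_decay` adds `weight_decay * w` to the gradient, so `weight_decay=0.002` would match the gradient of `l2(0.001)`:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the Keras layer; channel sizes are made up.
deconv = nn.ConvTranspose3d(16, 16, kernel_size=3, stride=2, padding=1)
criterion = nn.MSELoss()

x = torch.randn(1, 16, 8, 8, 8)
out = deconv(x)
target = torch.randn_like(out)

# Option 1: add the L2 penalty to the loss manually.
# Keras' l2(0.001) adds 0.001 * sum(w**2) to the loss.
loss = criterion(out, target) + 0.001 * deconv.weight.pow(2).sum()
loss.backward()

# Option 2: let the optimizer apply weight decay instead.
# For plain SGD, weight_decay adds weight_decay * w to the gradient,
# so weight_decay=0.002 matches the gradient of l2(0.001).
optimizer = torch.optim.SGD(deconv.parameters(), lr=0.01, weight_decay=0.002)
```

Note that passing `deconv.parameters()` also decays the bias, while `kernel_regularizer` only touches the kernel; to mimic it exactly you would pass only the weight tensor (see the parameter-group example further down).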

Thanks a lot, Peter, a small doubt: I have several layers which need kernel regularization. Will it suffice to add just one model.layer.weight parameter?

No, you would have to add all layers which should be regularized to the loss calculation or to the optimizer’s parameter list containing the weight_decay.
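As a sketch of the loss-based approach, one way to cover several layers at once is to iterate over `model.modules()` and sum the squared weights of every layer type you want regularized (the layer types chosen here are an assumption):

```python
import torch.nn as nn

def l2_penalty(model, module_types=(nn.ConvTranspose3d, nn.Conv3d)):
    # Accumulate sum(w**2) over the weights of every matching submodule.
    return sum(
        m.weight.pow(2).sum()
        for m in model.modules()
        if isinstance(m, module_types)
    )

# Inside the training loop, recompute the penalty each iteration:
# loss = criterion(output, target) + 0.001 * l2_penalty(model)
```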

Any links for the weight decay solution?

The Optimizer - Per-parameter options docs give you an example of how to pass different parameter groups to an optimizer, which can be used to e.g. specify weight decay only for a specific group.
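Adapted from that pattern, a minimal sketch could look like this (the two layers and their sizes are made up for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical two-layer setup; names and sizes are made up.
deconv = nn.ConvTranspose3d(8, 16, kernel_size=3, padding=1)
fc = nn.Linear(16, 10)

# Decay only the deconv kernel, mirroring kernel_regularizer (which leaves
# the bias alone); all other parameters get no weight decay.
optimizer = torch.optim.SGD(
    [
        {"params": [deconv.weight], "weight_decay": 2e-3},
        {"params": [deconv.bias] + list(fc.parameters()), "weight_decay": 0.0},
    ],
    lr=0.01,
)
```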

Thanks a ton, Peter.

I've been learning PyTorch since last week and I have a doubt. Why do we have to do everything imperatively in PyTorch, as compared to TensorFlow where we just declare what we want and that's it? For example, in the case of a regularizer, we can just use tf.keras.regularizers.L2.