Converting TensorFlow slim L2 weight regularization to PyTorch

I am converting a TensorFlow model to PyTorch:

import tensorflow as tf
import tensorflow.contrib.slim as slim

def net(posenet_inputs):
    # batch_norm_params is defined elsewhere in my code
    with slim.arg_scope([slim.conv2d],
                        normalizer_fn=slim.batch_norm,
                        normalizer_params=batch_norm_params,
                        weights_regularizer=slim.l2_regularizer(scale=0.0001),
                        activation_fn=tf.nn.relu):
        conv1 = slim.conv2d(posenet_inputs, 16, 7, 2)  # 16 filters, 7x7 kernel, stride 2
        conv2 = slim.conv2d(conv1, 32, 5, 2)           # 32 filters, 5x5 kernel, stride 2
        ...

I have written the basic network in PyTorch and want to apply exactly the same weight regularization. I found that setting weight_decay in torch.optim.Adam adds L2 regularization (see the sketch below), but I don't think that exactly recreates the TF behaviour: slim.l2_regularizer(scale) penalizes only the conv kernels with scale * sum(w**2) / 2, whereas weight_decay is applied to every parameter handed to the optimizer.
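For context, my PyTorch side looks roughly like this, and this is the weight_decay option I mean (a minimal sketch: the layer sizes mirror the first two slim convs, the input channel count of 3 and the learning rate are just placeholders):

import torch
import torch.nn as nn

# rough PyTorch counterpart of the first two slim conv layers
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=7, stride=2, padding=3),
    nn.BatchNorm2d(16),
    nn.ReLU(inplace=True),
    nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2),
    nn.BatchNorm2d(32),
    nn.ReLU(inplace=True),
)

# weight_decay adds an L2 penalty, but to every parameter passed in
# (conv biases and batch-norm affine parameters included), not just the conv kernels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)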

Is there another way to do this in PyTorch that uses the scale parameter the same way slim does?
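
The closest thing I can come up with is adding the penalty to the loss myself (a rough sketch; l2_regularization is just a helper name I made up, and criterion/outputs/targets in the usage comment are placeholders):

import torch
import torch.nn as nn

def l2_regularization(model, scale=1e-4):
    # mimic slim.l2_regularizer: scale * tf.nn.l2_loss(w) = scale * sum(w**2) / 2,
    # applied only to conv kernels (not biases or batch-norm parameters)
    penalty = 0.0
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            penalty = penalty + 0.5 * scale * module.weight.pow(2).sum()
    return penalty

# usage: add the penalty to the task loss before calling backward()
# loss = criterion(outputs, targets) + l2_regularization(model, scale=1e-4)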