How to write my own learning rate function

Hey guys,
I want to define my own learning rate schedule.
The code in TensorFlow looks like this:

import tensorflow as tf

class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
  def __init__(self, d_model, warmup_steps=4000):
    super().__init__()
    self.d_model = tf.cast(d_model, tf.float32)
    self.warmup_steps = warmup_steps

  def __call__(self, step):
    # lr = d_model^-0.5 * min(step^-0.5, step * warmup_steps^-1.5)
    step = tf.cast(step, dtype=tf.float32)
    arg1 = tf.math.rsqrt(step)
    arg2 = step * (self.warmup_steps ** -1.5)
    return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2)

How can I write this in PyTorch? (ChatGPT's version didn't work!)

Hi @Farshid1,

You could have a look at the existing LR schedulers in the docs here and view their source here as well. This would give you a rough idea of how to construct your own custom LR scheduler.
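
For this particular schedule you don't necessarily need a full scheduler subclass: `torch.optim.lr_scheduler.LambdaLR` multiplies the optimizer's base LR by whatever your function returns for the current step, so you can express the Transformer warmup formula directly. Here is a minimal sketch; `d_model = 512`, `warmup_steps = 4000`, and the `transformer_lr` / placeholder model names are my own assumptions, not something from your code:

import torch

# Assumed values; replace with whatever your model uses.
d_model = 512
warmup_steps = 4000

def transformer_lr(step):
    # Same formula as the TF schedule: d_model^-0.5 * min(step^-0.5, step * warmup_steps^-1.5)
    step = max(step, 1)  # LambdaLR starts at step 0; avoid dividing by zero
    return (d_model ** -0.5) * min(step ** -0.5, step * warmup_steps ** -1.5)

model = torch.nn.Linear(d_model, d_model)  # placeholder model for the example
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)  # base LR of 1.0 so the lambda's value is used directly
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=transformer_lr)

for step in range(10):
    optimizer.step()
    scheduler.step()  # call once per training step, not once per epoch

If you'd rather mirror the TF class structure, the same formula can live in a subclass of `torch.optim.lr_scheduler.LRScheduler` that overrides `get_lr()`; reading the source of the built-in schedulers linked above shows the pattern.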