Different learning rates for different classes

How can I set an adaptive learning rate per class in multi-class classification during training? For example, when some classes' loss reaches a threshold, multiply their learning rate by 0.1, while the other classes keep their learning rate unchanged.

What exactly do you want to do? The learning rate is tied to the model parameters, but it seems you want to modify it depending on the output. A mini-batch may contain samples from several classes, so could you explain in more depth?

You can set a threshold for a class and modify the learning rate a mini-batch will be backpropagated with before calling backward, but that still does not solve the problem that the batch will contain different classes, all of which are backpropagated with the same learning rate.
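A minimal sketch of that per-batch idea, assuming a hypothetical `converged_classes` set and a 0.1 scaling factor (both illustrative, not from the thread). It also shows the limitation just mentioned: mixed batches still get a single learning rate.

```python
import torch

# Illustrative setup; model, lr, and converged_classes are assumptions.
model = torch.nn.Linear(4, 3)
base_lr = 0.01
optimizer = torch.optim.SGD(model.parameters(), lr=base_lr)
converged_classes = {2}  # classes whose loss already hit the threshold

def step_with_batch_lr(inputs, targets):
    # Only if *every* sample in the batch belongs to a converged class
    # can we safely reduce the LR; mixed batches share one LR, which is
    # exactly the limitation described above.
    if all(t.item() in converged_classes for t in targets):
        lr = base_lr * 0.1
    else:
        lr = base_lr
    for group in optimizer.param_groups:
        group["lr"] = lr
    loss = torch.nn.functional.cross_entropy(model(inputs), targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss
```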

You may instead consider weighting the loss per class depending on this threshold; that way the gradients of those classes are scaled down too, which achieves an effect similar to a per-class learning rate.
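A sketch of that loss-weighting approach, using the `weight` argument of `cross_entropy`. The `threshold` and `scale` values and the running per-class check are illustrative assumptions, not part of the thread.

```python
import torch
import torch.nn as nn

num_classes = 3
threshold = 0.5  # assumed loss threshold per class
scale = 0.1      # down-weighting factor, mirrors the "LR * 0.1" idea

class_weights = torch.ones(num_classes)

def update_class_weights(logits, targets):
    # Average the per-sample loss within each class; once a class's
    # loss drops below the threshold, shrink its weight so its
    # gradients (and effective learning rate) shrink by `scale`.
    per_sample = nn.functional.cross_entropy(logits, targets, reduction="none")
    for c in range(num_classes):
        mask = targets == c
        if mask.any() and per_sample[mask].mean() < threshold:
            class_weights[c] = scale

def weighted_loss(logits, targets):
    # `weight` rescales each class's contribution to the loss, so the
    # backward pass scales those classes' gradients by the same factor.
    return nn.functional.cross_entropy(logits, targets, weight=class_weights)
```

Because the weighting happens inside the loss rather than the optimizer, it works even when a mini-batch mixes converged and non-converged classes.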