optim.Adam: LR as a tensor

Which implementations support a tensor LR?

  • lr (float, Tensor, optional) – learning rate (default: 1e-3). A tensor LR is not yet supported for all our implementations. Please use a float LR if you are not also specifying fused=True or capturable=True.

No, really; I want to use a tensor LR, but the documentation says:

A tensor LR is not yet supported for all our implementations.

So I wondered: does any implementation actually support a tensor LR? If so, which one?
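
To make it concrete, here is a minimal sketch of what I'm trying, assuming (from the quoted note) that passing `fused=True` or `capturable=True` is what allows a tensor LR. The `capturable` path may require CUDA tensors depending on the PyTorch version, so I guard on availability; everything else is just a toy model to exercise `step()`:

```python
import torch

# Goal: pass the learning rate to Adam as a tensor instead of a float.
# My reading of the docs note is that this needs fused=True or capturable=True.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(4, 2).to(device)
lr = torch.tensor(1e-3, device=device)  # tensor LR instead of a plain float

if device == "cuda":
    # Assumption: capturable=True (or fused=True) is the implementation
    # the note refers to; capturable stepping expects CUDA tensors here.
    opt = torch.optim.Adam(model.parameters(), lr=lr, capturable=True)
    x = torch.randn(8, 4, device=device)
    model(x).sum().backward()
    opt.step()
else:
    # On CPU I fall back to a float LR, which is what the docs ask for
    # when neither fused=True nor capturable=True is set.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
```

Is this the intended way to do it, or is there another implementation (foreach, fused, single-tensor) that accepts a tensor LR directly?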