Rounding off a floating point value and backpropagation

I have rounded off a floating point value using the code below, but it does not allow the gradient to backpropagate. Is there any way to round a value to two decimal places so that the gradient still backpropagates?

 import torch

 x = torch.FloatTensor([2.123])
 n_digits = 2
 # scale up, round to the nearest integer, then scale back down
 x_r = (x * 10**n_digits).round() / (10**n_digits)
 print(x_r)  # tensor([2.1200])
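
For reference, a minimal check (assuming the same two-decimal rounding as above) shows why backpropagation is blocked: the derivative of round() is zero almost everywhere, so the gradient that reaches x is zero:

 x = torch.FloatTensor([2.123]).requires_grad_()
 x_r = (x * 10**n_digits).round() / (10**n_digits)
 x_r.backward()
 print(x.grad)  # tensor([0.]) -- round() contributes zero gradient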

Would something like a custom round function that reuses the gradient of the original tensor work, as suggested in Torch.round() gradient - #6 by Zhongzhi_Yu?
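
As a sketch of that idea, a common workaround is a straight-through estimator: use the rounded value in the forward pass but let the gradient of the original tensor pass through unchanged. The helper name differentiable_round below is just illustrative, assuming two decimal places as above:

 import torch

 def differentiable_round(x, n_digits=2):
     # Forward pass returns the rounded value; backward pass behaves like the
     # identity, because detach() removes round() from the autograd graph.
     scale = 10 ** n_digits
     rounded = (x * scale).round() / scale
     return x + (rounded - x).detach()

 x = torch.FloatTensor([2.123]).requires_grad_()
 y = differentiable_round(x)
 print(y)       # tensor([2.1200], grad_fn=<AddBackward0>)
 y.backward()
 print(x.grad)  # tensor([1.]) -- gradient of the original tensor passes through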