How does PyTorch handle backward propagation with torch.round()?

The torch.round() function is piecewise constant, so it is not differentiable at the jump points and has zero slope everywhere else. How does autograd handle the gradient calculation for it? Are there any references I can look into?

This might help:
https://pytorch.org/docs/stable/notes/autograd.html#gradients-for-non-differentiable-functions
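Per those notes, autograd assigns a conventional gradient at non-differentiable points. Since torch.round() is flat between its jumps, the gradient it propagates is zero everywhere, which you can verify directly:

```python
import torch

# Leaf tensor we want gradients for
x = torch.tensor([0.4, 0.5, 1.7], requires_grad=True)

# round() is piecewise constant: its derivative is 0 almost everywhere
y = torch.round(x)

# Backpropagate through the rounding op
y.sum().backward()

print(x.grad)  # tensor([0., 0., 0.])
```

A practical consequence: placing torch.round() inside a model kills the gradient signal for everything upstream of it, which is why quantization-style code often routes gradients around the rounding step instead of through it.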