Is there something similar to tf.keras.backend.epsilon() in PyTorch? Instead of explicitly defining a very small number each time, we could use something like torch.epsilon(), for instance.
No.
AFAIK, PyTorch does not have anything similar to the epsilon you mention above.
IMO, it is good practice to define our own epsilon value in the program and not depend on the framework.
Just do something like
eps = 1e-10
somewhere in your codebase and use eps throughout… I don't see why you would need one provided by the PyTorch core library.
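For example, a minimal sketch of that pattern (EPS and safe_log are just illustrative names here, not anything PyTorch provides):

import torch

# One project-wide fuzz factor, defined once and reused everywhere.
EPS = 1e-10

def safe_log(x):
    # Clamp away from zero before taking the log to avoid -inf/NaN.
    return torch.log(x.clamp(min=EPS))

print(safe_log(torch.tensor([0.0, 0.5, 1.0])))  # no -inf in the output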
I agree that this would be overkill. It's literally just

@tf_export('keras.backend.epsilon')
def epsilon():
  """Returns the value of the fuzz factor used in numeric expressions.

  Returns:
      A float.
  """
  return _EPSILON

(Source: https://github.com/tensorflow/tensorflow/blob/r1.12/tensorflow/python/keras/backend.py)
where _EPSILON = 1e-7.
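So calling it from user code just reads back that constant; if I remember correctly, the backend also exposes a matching setter, so a quick check would look something like:

import tensorflow as tf

print(tf.keras.backend.epsilon())   # 1e-07 by default
tf.keras.backend.set_epsilon(1e-5)  # the fuzz factor is a settable global
print(tf.keras.backend.epsilon())   # 1e-05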
Thanks, you are right. I have done exactly that, something like eps = 1e-10; I just thought maybe there was something I was not aware of.
torch.finfo(torch.float32).eps
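For context, torch.finfo(dtype) reports the floating-point limits of a dtype, and .eps is its machine epsilon, so you can pick an epsilon that matches the tensor's precision instead of hard-coding one. A minimal sketch (safe_div is just an illustrative helper name):

import torch

# Machine epsilon depends on the dtype, so query it rather than hard-coding.
print(torch.finfo(torch.float16).eps)  # 0.0009765625
print(torch.finfo(torch.float32).eps)  # 1.1920928955078125e-07
print(torch.finfo(torch.float64).eps)  # 2.220446049250313e-16

def safe_div(num, den):
    # Use the epsilon that matches the denominator's own precision.
    eps = torch.finfo(den.dtype).eps
    return num / (den + eps)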