Is there something similar to `tf.keras.backend.epsilon()` in PyTorch? Instead of explicitly defining a very small number each time, we could use something like `torch.epsilon()`, for instance.
AFAIK, PyTorch does not have anything similar to the epsilon you mention above.
IMO, it is good practice to define your own epsilon value in the program and not depend on the framework.
Just define an `eps` constant somewhere in your codebase and use `eps` throughout… I don't see why you would need one provided by the PyTorch core lib.
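The suggestion above can be sketched as follows; `EPS` and `safe_log` are illustrative names I'm making up for the example, not part of any library:

```python
import torch

# Module-level fuzz factor, defined once and reused everywhere.
# 1e-7 mirrors the Keras default; adjust to your precision needs.
EPS = 1e-7

def safe_log(x: torch.Tensor) -> torch.Tensor:
    # Clamping to EPS before log avoids log(0) producing -inf.
    return torch.log(x.clamp(min=EPS))

print(safe_log(torch.tensor([0.0, 1.0])))
```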
I agree that this would be overkill. It’s literally just
```python
@tf_export('keras.backend.epsilon')
def epsilon():
  """Returns the value of the fuzz factor used in numeric expressions.

  Returns:
      A float.
  """
  return _EPSILON
```

with `_EPSILON = 1e-7`.
Thanks, you are right. I have done exactly that, something like `eps = 1e-10`; I just thought maybe there was something I wasn't aware of.
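One thing worth noting: while PyTorch has no `epsilon()` helper, it does expose each dtype's machine epsilon through `torch.finfo`, which is probably the closest built-in analogue if you want a dtype-aware value rather than a hand-picked constant:

```python
import torch

# Machine epsilon for a given floating-point dtype:
# the smallest eps such that 1.0 + eps != 1.0.
eps32 = torch.finfo(torch.float32).eps  # ~1.19e-07
eps64 = torch.finfo(torch.float64).eps  # ~2.22e-16

print(eps32, eps64)
```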