Change h finite diff step size for grad calc

I’m using PyTorch on data which is much noisier than typical image data, and I want to change the step size (h or eps) that autograd uses in the finite difference calculation:
grad = (f(x + h) - f(x)) / h

I read somewhere that the default eps or h is 1e-6; I want to use 1e-3 or 1e-4.
I think I could override backward() in my layer (a convolution that subclasses torch.nn.Conv1d), but defining my own backward() seems like overkill when I only want to change a step size. It would also be a bit strange: my backward() would define a finite difference calculation, whereas I believe backward() is generally used to replace the finite difference calculation (e.g. by providing an analytical function).
Note: I’m using a small CNN with SGD or Adam.

What’s the best way to do this?

Hi,

PyTorch does not use finite differences during the backward pass; it uses automatic differentiation (a chain of analytical gradient functions), so there is no step size to change.
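As a minimal illustration (the values here are just an example), the gradient autograd returns is the exact analytical one, with no h involved:

```python
import torch

x = torch.tensor([3.0], requires_grad=True)
y = (x ** 2).sum()   # y = x^2
y.backward()         # autograd applies the analytical rule dy/dx = 2x
print(x.grad)        # tensor([6.]) -- exact, not a finite-difference estimate
```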
Only the gradcheck tool that we use to test backward implementations actually does finite differences, and that is used purely for testing purposes.
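If what you want is to control the step size used by gradcheck itself, it takes an eps argument. A small sketch, assuming a Conv1d layer like yours (the shapes and eps value are illustrative only):

```python
import torch
from torch.autograd import gradcheck

# gradcheck expects double precision for reliable numerical comparison
conv = torch.nn.Conv1d(1, 1, kernel_size=3).double()
inp = torch.randn(1, 1, 8, dtype=torch.double, requires_grad=True)

# eps is the finite-difference step used only by this testing utility,
# not by the normal backward pass of your network.
print(gradcheck(conv, (inp,), eps=1e-4))
```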