LBFGS Optimizer is non-deterministic?

Is this optimizer expected to produce non-deterministic results, given the same parameter initialization and optimization settings? Simply running the program multiple times, without any changes, produces different results.

I have set the following to try to disable the non-deterministic behavior, but with no luck:

```python
np.random.seed(0)
torch.manual_seed(0)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
```

Hi,

You can check the note on reproducibility in the doc here: https://pytorch.org/docs/stable/notes/randomness.html

There are many reasons that could cause it (or any other part of your code) to be non-deterministic.
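One way to narrow it down is to isolate LBFGS itself on a tiny problem, re-seeding every source of randomness immediately before each run, and comparing the resulting parameters. The snippet below is a minimal sketch (the toy least-squares problem and all names are my own, not from the original post); on CPU, LBFGS itself should be deterministic, so if runs still diverge the non-determinism likely comes from elsewhere (e.g. CUDA kernels or data loading).

```python
import random

import numpy as np
import torch


def run_lbfgs():
    # Re-seed every source of randomness before each run, so both
    # runs start from identical data and identical initial state.
    random.seed(0)
    np.random.seed(0)
    torch.manual_seed(0)

    # Toy problem: fit w to minimize ||X w - y||^2.
    X = torch.randn(20, 3)
    true_w = torch.tensor([1.0, -2.0, 0.5])
    y = X @ true_w

    w = torch.zeros(3, requires_grad=True)
    opt = torch.optim.LBFGS([w], lr=1.0, max_iter=20)

    def closure():
        # LBFGS re-evaluates the loss several times per step,
        # so it requires a closure that recomputes the gradient.
        opt.zero_grad()
        loss = ((X @ w - y) ** 2).sum()
        loss.backward()
        return loss

    opt.step(closure)
    return w.detach().clone()


w1 = run_lbfgs()
w2 = run_lbfgs()
print(torch.equal(w1, w2))
```

If this prints `True` on CPU but your full training script still varies between runs, the culprit is more likely in the surrounding code (GPU kernels, `DataLoader` workers, unseeded `random`/`numpy` usage) than in the optimizer.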