Is this optimizer expected to produce non-deterministic results? The results below were produced under the same parameter initialization and optimization settings; simply running the program multiple times without any changes yields different results.
I have set the following to try to disable the non-determinism, but with no luck:

```python
np.random.seed(0)
torch.manual_seed(0)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
```
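For reference, this is the fuller setup I would expect to need on a recent PyTorch build (a sketch; `set_deterministic` is my own helper name, not a PyTorch API). It adds Python's own `random` seed, the `CUBLAS_WORKSPACE_CONFIG` environment variable that some CUDA ops require, and `torch.use_deterministic_algorithms(True)`, which raises an error on any op that has no deterministic implementation and so can help pinpoint where the variance comes from:

```python
import os
import random

import numpy as np
import torch


def set_deterministic(seed: int = 0) -> None:
    """Seed every RNG PyTorch code may touch and force deterministic kernels."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)  # seeds CPU and all CUDA devices
    # Force deterministic cuDNN kernels and disable autotuning.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
    # Required by some CUDA >= 10.2 ops when determinism is enforced.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"
    # Error out on any op that lacks a deterministic implementation.
    torch.use_deterministic_algorithms(True)


# Re-seeding before each run should now reproduce the same draws.
set_deterministic(0)
a = torch.randn(3)
set_deterministic(0)
b = torch.randn(3)
assert torch.equal(a, b)
```

Note that even with all of this, run-to-run differences can still come from sources the flags do not cover, such as `DataLoader` worker scheduling or third-party code that keeps its own RNG.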