Set a specific layer/operator to use non-deterministic mode


I tried to use torch.use_deterministic_algorithms(True), but some layers in my network do not support determinism. Is there a way to set the layers that do not support determinism to non-deterministic mode, while keeping the other layers in deterministic mode?


No, I don’t think it’s possible to pick determinism per module: use_deterministic_algorithms enables determinism globally for all used algorithms (or raises a warning/error when that isn’t possible).
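One global workaround worth mentioning, assuming a recent PyTorch (1.11 or later): passing `warn_only=True` makes PyTorch use deterministic algorithms wherever they exist and only emit a warning, instead of raising an error, when an operation without a deterministic implementation runs. It’s not per-layer control, but it lets the unsupported layers fall back to their non-deterministic kernels while the rest stay deterministic. A minimal sketch:

```python
import torch

# Use deterministic algorithms where available; ops without a
# deterministic implementation run non-deterministically and emit
# a UserWarning instead of raising a RuntimeError.
torch.use_deterministic_algorithms(True, warn_only=True)

# The current settings can be inspected:
print(torch.are_deterministic_algorithms_enabled())          # True
print(torch.is_deterministic_algorithms_warn_only_enabled()) # True
```

You can filter or log those warnings to find out exactly which ops in your model lack deterministic support.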
