PyTorch gives deterministic results within a script, but different results between scripts

This is also expected: as described in this post, `_BaseDataLoaderIter` creates a `base_seed` by calling into the PRNG, so simply constructing a `DataLoader` iterator advances the global PRNG state.
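A minimal sketch of the effect (the dataset here is just a placeholder): creating the iterator draws a `base_seed` internally, so the next `torch.randn` call sees a different PRNG state than it would have otherwise.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.arange(10, dtype=torch.float32))
loader = DataLoader(dataset, batch_size=2, shuffle=True)

torch.manual_seed(0)
print(torch.randn(3))   # reference draw from the freshly seeded PRNG

torch.manual_seed(0)
_ = iter(loader)        # creating the iterator draws a base_seed internally
print(torch.randn(3))   # different values: the global PRNG state was advanced
```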

It’s not unwanted behavior; it’s the expected behavior of any PRNG. If specific calls, such as `torch.randn(10)`, always returned the same values, the output wouldn’t be random anymore and the PRNG would be broken. Each call into the PRNG needs to advance its internal offset to guarantee the values stay random.
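You can see both halves of this in a few lines: consecutive draws differ because the offset advances, while re-seeding resets the state and replays the same first draw.

```python
import torch

torch.manual_seed(42)
a = torch.randn(10)
b = torch.randn(10)          # different values: the offset advanced

torch.manual_seed(42)        # reset the state to the same starting point
c = torch.randn(10)

print(torch.equal(a, b))     # False: consecutive calls must differ
print(torch.equal(a, c))     # True: same seed, same first draw
```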

Depending on this behavior is indeed not trivial: to reproduce a run you would either need to rerun exactly the same code (so the PRNG is consumed in the same order), or re-seed right before the part you care about, which risks repeating numbers, e.g. if you are reusing the same seed elsewhere.
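The first snippet below shows the repeated-numbers risk from re-seeding. The second is one way to decouple the two streams: `DataLoader` accepts a `generator` argument, so giving it its own `torch.Generator` keeps it from consuming the global PRNG (seeds and shapes here are arbitrary examples).

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Re-seeding replays the sequence, which may be intended or a silent bug:
torch.manual_seed(0)
x = torch.randn(3)
torch.manual_seed(0)         # reusing the same seed...
y = torch.randn(3)
print(torch.equal(x, y))     # True: ...repeats the same numbers

# Alternative: pass a dedicated generator so the DataLoader no longer
# advances the global PRNG stream
g = torch.Generator().manual_seed(0)
loader = DataLoader(TensorDataset(torch.arange(10, dtype=torch.float32)),
                    batch_size=2, shuffle=True, generator=g)
```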