I have a convolutional neural network, and I set np.random.seed manually; based on that seed I later fill in the starting weights of the conv and fully-connected layers, so that everything is deterministic. And it works. But then I added two MaxPool2d layers, which I thought should be deterministic, but it turns out one of them is not. Basically these are my conv layers:
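The setup described above can be sketched roughly like this (the layer shapes here are hypothetical, just to illustrate seeding NumPy and copying the generated values into a layer's weights):

```python
import numpy as np
import torch
import torch.nn as nn

np.random.seed(0)  # fixed seed so the generated weights are reproducible

conv = nn.Conv2d(3, 16, kernel_size=3)  # hypothetical layer shape
with torch.no_grad():
    # fill the layer's weights from the seeded NumPy generator
    conv.weight.copy_(torch.from_numpy(
        np.random.randn(*conv.weight.shape).astype(np.float32)))
```

With this pattern, rerunning the script regenerates the exact same starting weights, so any remaining nondeterminism must come from the ops themselves rather than the initialization.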
I guess you are using cuDNN? If so, the default max-pooling algorithm is known not to be deterministic.
You can set torch.backends.cudnn.deterministic = True to force it to use a deterministic algorithm (at a performance cost).
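A minimal sketch of the flags usually set together for reproducible runs (the benchmark flag and manual_seed call are additions beyond the single flag mentioned above, but are commonly paired with it):

```python
import torch

# Force cuDNN to pick deterministic algorithms (slower, but reproducible)
torch.backends.cudnn.deterministic = True
# Disable the autotuner, which can otherwise select different algorithms per run
torch.backends.cudnn.benchmark = False
# Seed PyTorch's own RNG as well (covers CPU and all CUDA devices)
torch.manual_seed(0)
```

These flags only take effect for ops that go through cuDNN; on a CPU-only run they are harmless no-ops.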
If I remember correctly, the computation of the max itself is deterministic, but the backward pass can be implemented much more efficiently if you accept non-deterministic output (which is what cuDNN does by default).