C++ PyTorch API runs very slowly on Windows

I have been using the C++ frontend API for a few days and have noticed a drastic difference in inference timings between Linux and Windows. On Windows, a simple forward pass through my model can take several seconds in debug mode and up to a second in release mode. On my Ubuntu system, timing was never a concern because the same forward pass completes in a few milliseconds. Is this a known issue? Are there any obvious reasons for this sort of behavior? I am using the exact same code on both the Linux and Windows builds, and it was taken from this page: Installing C++ Distributions of PyTorch — PyTorch master documentation
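
For reference, here is a minimal sketch of the kind of timed forward pass I am comparing on both systems (the model, layer sizes, and input shape are placeholders, not my actual network):

```cpp
#include <torch/torch.h>
#include <chrono>
#include <iostream>

int main() {
  // Placeholder model; my real network is different but is built
  // with the same C++ frontend modules.
  torch::nn::Sequential model(
      torch::nn::Linear(256, 128),
      torch::nn::ReLU(),
      torch::nn::Linear(128, 10));
  model->eval();

  torch::NoGradGuard no_grad;        // inference only, no autograd bookkeeping
  auto input = torch::randn({1, 256});

  model->forward(input);             // warm-up pass

  auto start = std::chrono::steady_clock::now();
  auto output = model->forward(input);
  auto end = std::chrono::steady_clock::now();

  std::cout << "Forward pass: "
            << std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count()
            << " ms" << std::endl;
}
```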