Hello!
I am in the process of writing a library to emulate stochastic hardware in PyTorch. A simple statement allows me to turn the stochasticity on/off. However, the PyTorch code gets executed before this statement, even though they are declared in the right order, i.e.:
manager.random_off() # get executed second
my_module(tensor) # get executed first
All of this happens in the context of a small test script, on CPU. A few print statements helped me notice it happening. I'm not sure what is going on here; I'd welcome any help/input/pointers! Thanks.
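For reference, eager-mode PyTorch on CPU executes statements strictly in program order, so a switch like this should take effect immediately. Here is a minimal, dependency-free sketch of the on/off pattern being described; the names `StochasticManager` and `NoisyModule` are hypothetical stand-ins for the poster's actual manager and module, not their real API:

```python
import random

class StochasticManager:
    """Global switch that modules consult before injecting noise."""
    def __init__(self):
        self.enabled = True

    def random_on(self):
        self.enabled = True

    def random_off(self):
        self.enabled = False

class NoisyModule:
    """Adds small Gaussian noise to its input only while the switch is on."""
    def __init__(self, manager):
        self.manager = manager

    def __call__(self, values):
        if self.manager.enabled:
            return [v + random.gauss(0.0, 0.01) for v in values]
        return list(values)

manager = StochasticManager()
my_module = NoisyModule(manager)

out_noisy = my_module([1.0, 2.0])   # switch still on: output is perturbed
manager.random_off()                 # takes effect before the next call
out_clean = my_module([1.0, 2.0])   # switch off: output is exact
print(out_clean)
```

Run in order, the second call always returns the unperturbed input, since Python evaluates `manager.random_off()` before the following line.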
Could you simplify and trim down your code as much as possible while still having it display this behavior? Then post a self-contained, runnable script so that we can try to reproduce your issue. Also, do tell us what version of PyTorch you are using. For example, you can include print(torch.__version__) in your script.
Hi Frank!
My apologies; after rebooting my PC this morning, the issue no longer reproduces. For reference, I am using torch 1.8.1+cu102 on WSL2.