What does torch.utils.bottleneck do on its two runs?

I am trying to do some profiling with torch.utils.bottleneck. I see the script gets run twice, once with the line:

Running your script with cProfile

and a second time with

Running your script with the autograd profiler…

The first run completes, but the second one crashes. Based on my code, my suspicion is that bottleneck doesn't reset the entire Python process between runs, it just executes the script twice. I have some initialization code that could misbehave in that case, e.g. guards that check whether variables have already been initialized, which would already be true on the second run. I just want to confirm this is how it works before diving into how to 'uninitialize' my variables again.
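To illustrate the failure mode I'm worried about, here is a minimal, self-contained sketch. It assumes (which is exactly what I'd like confirmed) that bottleneck exec's the script twice in the same interpreter process, so state cached in imported modules survives between the two passes. The `my_state` module and `run_count` variable here are purely hypothetical stand-ins for my initialization guards:

```python
import sys
import types

# Simulate an imported module that caches initialization state,
# the way my real init code guards with "has this been set up yet?".
mod = types.ModuleType("my_state")
mod.initialized = False
sys.modules["my_state"] = mod

# A stand-in for my script: it skips initialization if the guard is set.
script = """
import my_state
if not my_state.initialized:
    my_state.initialized = True
    run_count = "first run: doing init"
else:
    run_count = "second run: init skipped"
"""

results = []
for _ in range(2):  # two passes in one process, like bottleneck's two profiling runs
    globs = {}       # fresh globals each pass, but sys.modules is shared
    exec(compile(script, "script.py", "exec"), globs)
    results.append(globs["run_count"])

print(results)
# -> ['first run: doing init', 'second run: init skipped']
```

If this matches what bottleneck actually does, then my guard variables would already be "initialized" on the second (autograd profiler) pass, which could explain the crash.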