MC dropout - strange behavior

Hi,

I’ve noticed some strange behavior when sampling my network with MC dropout. Currently I am using T=100 stochastic forward passes, and I sometimes observe very low epistemic uncertainty (the variance of the predictions over the T samples) and sometimes very high. That by itself is perhaps not so strange.
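For reference, this is roughly the kind of sampling I mean (a minimal PyTorch sketch, not my actual code; `model` and `x` stand in for my network and input):

```python
import torch

def enable_mc_dropout(model):
    # Put the model in eval mode (so e.g. BatchNorm uses running stats),
    # then switch Dropout layers back to train mode so they stay
    # stochastic at inference time. (Dropout2d etc. would need the
    # same treatment.)
    model.eval()
    for m in model.modules():
        if isinstance(m, torch.nn.Dropout):
            m.train()

@torch.no_grad()
def mc_dropout_predict(model, x, T=100):
    enable_mc_dropout(model)
    preds = torch.stack([model(x) for _ in range(T)])  # shape (T, batch, ...)
    mean = preds.mean(dim=0)
    var = preds.var(dim=0)  # per-output variance = epistemic uncertainty estimate
    return mean, var
```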

What is strange is that when I repeat such a sampling 10 times (each repeat: 100 stochastic forward passes on the same input), all 10 repeats produce either only low epistemic uncertainty or only high (they typically cluster around the same value). I would instead expect the repeats to be mixed, or to yield something in between.
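The repeat experiment looks something like this (again just a sketch, reusing the hypothetical `mc_dropout_predict` from above):

```python
# 10 independent repeats of the T=100 sampling on the same input.
# I would expect the variance estimates to vary freely between runs,
# but within one session they all come out either low or high.
for i in range(10):
    _, var = mc_dropout_predict(model, x, T=100)
    print(f"repeat {i}: mean predictive variance = {var.mean().item():.6f}")
```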

I’m trying to figure out why this happens and how to fix it, but I’m pretty clueless…

I’ve tried different seeds, but the same pattern keeps occurring.

Any tips or suggestions?