Well, I wouldn’t say it was “working fine earlier.” Rather, Normal
was failing to complain about an invalid negative value for scale
(sigma) in some situations.
Speaking from memory, PyTorch used to complain about a negative scale in
some, but not all, situations; the error checking has been tightened up
in more recent versions.
Also, from memory, when instantiated with negative scale, Normal
behaved (more or less) as if it had been instantiated as:
torch.distributions.Normal(0, abs(scale))
You could conceivably use abs() as a work-around to recover the old
behavior, but be cautious about sweeping the likely error under the
rug: you should figure out where your negative scale is coming from and
whether it really makes mathematical sense.
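To illustrate, here is a small sketch (the negative sigma is a hypothetical stand-in for whatever your upstream code produces). Current PyTorch validates distribution arguments by default, so a negative scale is rejected at construction time, and abs() recovers something like the old behavior:

```python
import torch

# Recent PyTorch versions validate arguments by default, so a
# negative scale raises ValueError at construction time:
err = None
try:
    torch.distributions.Normal(0.0, -1.0)
except ValueError as e:
    err = e
print("negative scale rejected:", err is not None)

# Work-around: take abs() of the scale. Use with caution, since the
# negative sign may indicate a real bug upstream.
sigma = -1.0  # hypothetical value from your own code
dist = torch.distributions.Normal(0.0, abs(sigma))
print(dist.scale)
```

You can also pass validate_args=False to suppress the check entirely, but that just reinstates the old silent behavior rather than fixing the underlying problem.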