Why does the VAE encoder output log variance and not standard deviation?

In discussions of VAEs (and in VAE implementations), the encoder typically outputs:

μ, log(variance)

Then, during training, when we sample the latent vector in the reparameterization step, we convert log(variance) to the standard deviation:

std = exp(0.5 * logvar)
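In context, a minimal sketch of that reparameterization step, loosely following the linked implementation (the function name and shapes here are illustrative, not taken from the repo):

```python
import torch

def reparameterize(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    # std = exp(0.5 * logvar) = sqrt(exp(logvar)) = sqrt(var),
    # and is always strictly positive by construction.
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)  # eps ~ N(0, I)
    return mu + eps * std        # z ~ N(mu, std^2)

# Illustrative usage: batch of 4 latent vectors of dimension 2.
mu = torch.zeros(4, 2)
logvar = torch.zeros(4, 2)  # logvar = 0  ->  var = 1  ->  std = 1
z = reparameterize(mu, logvar)
```

Note that because the encoder outputs logvar, the network's raw output can be any real number; exponentiating it always yields a valid (positive) standard deviation.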

(Example taken from PyTorch-VAE/vanilla_vae.py at master · AntixK/PyTorch-VAE · GitHub.)

If we need to convert log(variance) to the standard deviation anyway, why don't we have the encoder output the standard deviation directly, instead of performing this extra conversion?