While using torch_xla, it shows the following error:

    518   torch_xla._XLAC._xla_step_marker(
    519       torch_xla._XLAC._xla_get_default_device(), [],
--> 520       wait=xu.getenv_as('XLA_SYNC_WAIT', bool, False))
    521   # Only emit metrics from the first local device index, to avoid emitting the
    522   # same values from different threads.

RuntimeError: Failed precondition: From /job:tpu_worker/replica:0/task:0:
The TPU system has not been initialized.
[[{{node XRTCompile}}]]
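
For context, "The TPU system has not been initialized" generally means the process could not reach an initialized TPU backend before the first compilation step (the `XRTCompile` node above). One common first check, assuming the older XRT runtime and a TPU worker reachable on the default port 8470 (both assumptions, not stated in the report), is to make sure `XRT_TPU_CONFIG` is set before launching the script:

```shell
# Sketch, assuming the XRT runtime: point the client at the TPU worker.
# <tpu-worker-ip> is a placeholder for the TPU's internal IP address.
export XRT_TPU_CONFIG="tpu_worker;0;<tpu-worker-ip>:8470"
python train.py  # train.py is a hypothetical script name
```

If that variable is already set correctly, including its value and the torch_xla version in the issue will help the maintainers diagnose the problem.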

Hi, for all PyTorch/XLA (or PyTorch on TPU) questions, please open an issue at https://github.com/pytorch/xla.
Thanks!