UnboundLocalError: local variable 'beta1' referenced before assignment

Hi there!

I got an error about a local variable being referenced before assignment:

Traceback (most recent call last):
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/skrix/mavo/src/deeplink/__main__.py", line 8, in <module>
    main()
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/skrix/mavo/src/deeplink/cli.py", line 785, in run_opt
    run_bayesian_hyperopt(
  File "/home/skrix/mavo/src/deeplink/rgcn_hetero/deployer.py", line 410, in run_bayesian_hyperopt
    study.optimize(func=objective, n_trials=n_trials, timeout=timeout)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/optuna/study.py", line 385, in optimize
    _optimize(
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/optuna/_optimize.py", line 66, in _optimize
    _optimize_sequential(
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/optuna/_optimize.py", line 163, in _optimize_sequential
    trial = _run_trial(study, func, catch)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/optuna/_optimize.py", line 268, in _run_trial
    raise func_err
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/optuna/_optimize.py", line 217, in _run_trial
    value_or_values = func(trial)
  File "/home/skrix/mavo/src/deeplink/rgcn_hetero/deployer.py", line 370, in objective
    trainer.fit(
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 499, in fit
    self.dispatch()
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 546, in dispatch
    self.accelerator.start_training(self)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/accelerators/accelerator.py", line 73, in start_training
    self.training_type_plugin.start_training(trainer)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/plugins/training_type/training_type_plugin.py", line 114, in start_training
    self._results = trainer.run_train()
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 637, in run_train
    self.train_loop.run_training_epoch()
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/trainer/training_loop.py", line 492, in run_training_epoch
    batch_output = self.run_training_batch(batch, batch_idx, dataloader_idx)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/trainer/training_loop.py", line 654, in run_training_batch
    self.optimizer_step(optimizer, opt_idx, batch_idx, train_step_and_backward_closure)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/trainer/training_loop.py", line 425, in optimizer_step
    model_ref.optimizer_step(
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/core/lightning.py", line 1390, in optimizer_step
    optimizer.step(closure=optimizer_closure)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/core/optimizer.py", line 214, in step
    self.__optimizer_step(*args, closure=closure, profiler_name=profiler_name, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/core/optimizer.py", line 134, in __optimizer_step
    trainer.accelerator.optimizer_step(optimizer, self._optimizer_idx, lambda_closure=closure, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/accelerators/accelerator.py", line 277, in optimizer_step
    self.run_optimizer_step(optimizer, opt_idx, lambda_closure, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/accelerators/accelerator.py", line 282, in run_optimizer_step
    self.training_type_plugin.optimizer_step(optimizer, lambda_closure=lambda_closure, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/pytorch_lightning/plugins/training_type/training_type_plugin.py", line 163, in optimizer_step
    optimizer.step(closure=lambda_closure, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/torch/optim/optimizer.py", line 89, in wrapper
    return func(*args, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/skrix/.conda/envs/new_deeplink_env/lib/python3.8/site-packages/torch/optim/adamw.py", line 117, in step
    beta1,
UnboundLocalError: local variable 'beta1' referenced before assignment


Do you have an idea why? 

PyTorch version: 1.8.0

Can you provide an MRE (minimal reproducible example)?
It’s hard to tell what’s going on without the code.
That said, you usually get this error when a variable (beta1 in this case) can’t be computed or initialized, and is then used immediately afterwards.
For instance, let’s suppose you have the following code:

    if condition:
        beta1 = torch.something(vector, ...)

    loss = beta1 * SOME_LOSS_FUNCTION

If the condition is not met, beta1 is never assigned, so it can’t be used to compute the loss variable either.
So take a look at your code and check whether beta1 is always assigned before it is used.
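As a minimal standalone illustration of that pattern (hypothetical names, not your code), the same failure can be reproduced in a few lines:

```python
def scale_loss(loss, condition):
    # beta1 is only assigned inside this branch...
    if condition:
        beta1 = 0.9
    # ...so referencing it here fails whenever condition is False
    return beta1 * loss

print(scale_loss(10.0, True))       # works: 9.0
try:
    scale_loss(10.0, False)
except UnboundLocalError as e:
    print(type(e).__name__)         # UnboundLocalError
```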

Actually, I am not using beta1 anywhere myself. It is a variable set by torch inside the AdamW optimizer, which is why I am confused.

Can you post the code in which you define the optimizer?

Because AdamW only takes a tuple of values called betas, not beta1, so I’m a bit confused.

Sure, here you go:

    def configure_optimizers(self) -> th.optim.Optimizer:
        # optimizer
        all_params = list(itertools.chain(
            [x for x in self.rgcn.multi_modal_embed.parameters()],
            [x for x in self.rgcn.parameters()],
        ))

        optimizer = th.optim.AdamW(all_params, betas=(0.9, 0.999), lr=self.learning_rate, weight_decay=self.weight_decay)
        return optimizer

I guess that for some reason none of the passed parameters has a valid gradient, which then breaks the code. This was a known issue in PyTorch 1.8.0 after a refactoring of the optimizers; it was fixed in this PR and should be available in 1.8.1 as well as the nightly binaries.
To fix it you could thus update PyTorch, or make sure that at least some of the parameters have a gradient.
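For context, in the 1.8.0 refactor AdamW.step() only bound beta1 inside the per-parameter loop, after skipping parameters whose .grad is None, so if no parameter ever received a gradient the name was never assigned. A quick sanity check (a minimal sketch with a toy nn.Linear model, assuming PyTorch is installed) is to confirm at least one parameter carries a gradient before stepping:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 2)
opt = torch.optim.AdamW(model.parameters(), betas=(0.9, 0.999), lr=1e-3)

# Before any backward pass every .grad is None -- exactly the state
# that made step() fail with the unbound beta1 in 1.8.0.
print(any(p.grad is not None for p in model.parameters()))  # False

loss = model(torch.randn(3, 4)).sum()
loss.backward()

# Now the parameters carry gradients, so step() is safe.
print(any(p.grad is not None for p in model.parameters()))  # True
opt.step()
```

In a LightningModule you would run the equivalent check on the parameters you pass to the optimizer in configure_optimizers.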

2 Likes

Hi, @ptrblck. My torch version is 1.8.1+cu111. Why am I still seeing this error? Thanks.

The PR might not have been cherry-picked into 1.8.1 (you could check by searching for the commit in the v1.8.1 branch). As a quick check, update to the latest release and rerun your code.

1 Like

Python doesn’t have variable declarations, so it has to figure out the scope of variables itself. It does so by a simple rule: if there is an assignment to a variable anywhere inside a function, that variable is considered local to that function. The UnboundLocalError: local variable referenced before assignment is raised when you try to use such a variable before it has been assigned in the local scope.

Python has lexical scoping by default, which means that although an enclosed scope can access values in its enclosing scope, it cannot modify them (unless they’re declared global with the global keyword). All variable assignments in a function store the value in the local symbol table; whereas variable references first look in the local symbol table, then in the global symbol table, and then in the table of built-in names. Thus, global variables cannot be directly assigned a value within a function (unless named in a global statement), although they may be referenced.