Train.py: error: unrecognized arguments: False

Hello everyone, I’m new to PyTorch.

I’m trying to run code available on GitHub.
When I run the training command as indicated on the code page:

python train.py --config configs/pretrain.txt --distributed False

I get the following error:

python train.py --config configs/pretrain.txt --distributed False
usage: train.py [-h] [--config CONFIG] [--rootdir ROOTDIR] [--expname EXPNAME]
                [--distributed] [--local_rank LOCAL_RANK] [-j N]
                [--train_dataset TRAIN_DATASET]
                [--dataset_weights DATASET_WEIGHTS [DATASET_WEIGHTS ...]]
                [--train_scenes TRAIN_SCENES [TRAIN_SCENES ...]]
                [--eval_dataset EVAL_DATASET]
                [--eval_scenes EVAL_SCENES [EVAL_SCENES ...]]
                [--testskip TESTSKIP] [--sample_mode SAMPLE_MODE]
                [--center_ratio CENTER_RATIO] [--N_rand N_RAND]
                [--chunk_size CHUNK_SIZE] [--coarse_feat_dim COARSE_FEAT_DIM]
                [--fine_feat_dim FINE_FEAT_DIM]
                [--num_source_views NUM_SOURCE_VIEWS]
                [--rectify_inplane_rotation] [--coarse_only]
                [--anti_alias_pooling ANTI_ALIAS_POOLING] [--no_reload]
                [--ckpt_path CKPT_PATH] [--no_load_opt] [--no_load_scheduler]
                [--n_iters N_ITERS] [--lrate_feature LRATE_FEATURE]
                [--lrate_mlp LRATE_MLP]
                [--lrate_decay_factor LRATE_DECAY_FACTOR]
                [--lrate_decay_steps LRATE_DECAY_STEPS]
                [--N_samples N_SAMPLES] [--N_importance N_IMPORTANCE]
                [--inv_uniform] [--det] [--white_bkgd]
                [--render_stride RENDER_STRIDE] [--i_print I_PRINT]
                [--i_img I_IMG] [--i_weights I_WEIGHTS] [--llffhold LLFFHOLD]

The code: IBRNet/train.py at master · googleinterns/IBRNet · GitHub

Any help?

I guess the --distributed argument is used as a switch (a store_true flag) and doesn’t expect a boolean value after it. Try:

python train.py --config configs/pretrain.txt --distributed
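
For context, a switch like this is typically declared with argparse’s store_true action, so just writing the flag sets it to True and omitting it leaves it False. Here is a minimal sketch of that behavior (an assumption about how the IBRNet parser is set up, not the exact repo code):

import argparse

# Minimal sketch: a boolean switch declared with store_true.
parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, default='')
parser.add_argument('--distributed', action='store_true',
                    help='present = True, absent = False')

# Works: the flag alone turns the option on.
args = parser.parse_args(['--config', 'configs/pretrain.txt', '--distributed'])
print(args.distributed)  # True

# Passing a value after the flag, e.g. "--distributed False", leaves "False"
# as an unconsumed argument, and argparse exits with
# "error: unrecognized arguments: False" -- the error you are seeing.

So the flag’s presence alone is what enables distributed mode; there is no need (and no way) to pass True or False after it.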