Facing problems with Detectron2 Lazy Config (.py) file

Hey. I want to use a config file from the detectron2 MViTv2 project on my custom dataset "my_dataset_train", which is registered in COCO format through DatasetCatalog.

I want to use this config file
[https://github.com/facebookresearch/detectron2/blob/main/projects/MViTv2/configs/mask_rcnn_mvitv2_t_3x.py]

in the same way I normally use a YAML config file. But when I apply that pattern to a lazy config, I get errors about the trainer.
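For reference, here is a sketch of the YAML workflow I mean (the model zoo config and the solver values are just placeholder examples, and this assumes "my_dataset_train" is already registered):

```python
# Typical YAML-config workflow: get_cfg() + DefaultTrainer.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.DATASETS.TRAIN = ("my_dataset_train",)
cfg.DATASETS.TEST = ()
cfg.SOLVER.IMS_PER_BATCH = 8
cfg.SOLVER.BASE_LR = 0.001
cfg.SOLVER.MAX_ITER = 100

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```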

I have figured out some of the config. Here is what I have so far:


import logging
import sys

from detectron2.checkpoint import DetectionCheckpointer
from detectron2.config import LazyConfig, instantiate
from detectron2.engine import (
    AMPTrainer,
    SimpleTrainer,
    DefaultTrainer,
    default_argument_parser,
    default_setup,
    default_writers,
    hooks,
    launch,
)
# from detectron2.engine.defaults import create_ddp_model
from detectron2.evaluation import inference_on_dataset, print_csv_format
from detectron2.utils import comm
from detectron2 import model_zoo  # needed for model_zoo.get_config below


cfg = LazyConfig.load("/kaggle/working/detectron2/projects/MViTv2/configs/mask_rcnn_mvitv2_t_3x.py")

# Edit the config before instantiating anything, so the overrides actually take effect.
cfg.dataloader.train.dataset.names = "my_dataset_train"
cfg.dataloader.train.total_batch_size = 8
cfg.train.max_iter = 100
cfg.train.init_checkpoint = "detectron2://ImageNetPretrained/mvitv2/MViTv2_T_in1k.pyth"
# cfg.train.amp.enabled = True
cfg.optimizer = model_zoo.get_config("common/optim.py").SGD  # optional: the MViTv2 config already defines an optimizer
cfg.optimizer.lr = 0.001

model = instantiate(cfg.model)
model.to(cfg.train.device)

cfg.optimizer.params.model = model  # the optimizer config builds its param groups from the model
optim = instantiate(cfg.optimizer)

# train_loader = instantiate(cfg.dataloader.train)
# trainer = SimpleTrainer(model, train_loader, optim)
trainer = DefaultTrainer(cfg)  # <-- this is the line that fails

print(model)

trainer.train()

Error or Issue: the problem is with the trainer. AMPTrainer seems meant for mixed-precision / distributed setups, which is hard for me to figure out, while DefaultTrainer tells me to add a SOLVER section — but lazy configs don't have a SOLVER key.

Can anyone help me run this on a single GPU? The reference config seems to assume 64 GPUs by default.
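For context, from reading detectron2's tools/lazyconfig_train_net.py I think the intended pattern for lazy configs is something like the sketch below (single process, no DDP wrapper, no launch()) — but I am not sure this is right, which is part of why I'm asking:

```python
# Sketch of a single-GPU training loop for a LazyConfig, loosely following
# detectron2's tools/lazyconfig_train_net.py. Paths and the dataset name are
# from my setup; "my_dataset_train" must already be registered.
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.config import LazyConfig, instantiate
from detectron2.engine import SimpleTrainer, default_writers, hooks

cfg = LazyConfig.load("/kaggle/working/detectron2/projects/MViTv2/configs/mask_rcnn_mvitv2_t_3x.py")
cfg.dataloader.train.dataset.names = "my_dataset_train"
cfg.dataloader.train.total_batch_size = 8
cfg.train.max_iter = 100

model = instantiate(cfg.model)
model.to(cfg.train.device)

cfg.optimizer.params.model = model  # the optimizer config builds its param groups from the model
optim = instantiate(cfg.optimizer)

train_loader = instantiate(cfg.dataloader.train)

trainer = SimpleTrainer(model, train_loader, optim)
checkpointer = DetectionCheckpointer(model, cfg.train.output_dir, trainer=trainer)
checkpointer.load(cfg.train.init_checkpoint)  # load the ImageNet-pretrained weights

trainer.register_hooks([
    hooks.IterationTimer(),
    hooks.PeriodicWriter(default_writers(cfg.train.output_dir, cfg.train.max_iter)),
])
trainer.train(0, cfg.train.max_iter)
```

Is this roughly the right replacement for DefaultTrainer, and is SimpleTrainer (rather than AMPTrainer) fine for a single GPU?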