Access to attributes of model wrapped in DDP

Hi,
I have a model that is wrapped in DDP (DistributedDataParallel).

What is the right way to access all of the model's attributes?
I recall having a similar issue with DataParallel.

In DDP, the model is stored in ddp.module.
So far, I use ddp_model.module.attribute.
Is there a better way? Otherwise I have to go through the entire code and change this everywhere…
Also, I want to make sure I am working with the right model, because I need to read other things from it: gradients and other state from some of its layers.
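
For concreteness, here is the kind of access I mean, as a minimal self-contained sketch (the single-process gloo group is only there so the snippet runs on its own):

import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# single-process process group, only so DDP can be constructed here
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

net = torch.nn.Linear(4, 2)
ddp_model = DDP(net)
ddp_model(torch.randn(8, 4)).sum().backward()

print(ddp_model.module is net)              # True: .module is the wrapped model
print(ddp_model.module.weight.grad.norm())  # reading a gradient via .module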

Is it safe to use the same trick for DataParallel? I would also like to avoid if isinstance(model, DDP) / isinstance(model, MYMODEL) checks.

import torch

class myDDP(torch.nn.parallel.DistributedDataParallel):
    def __getattr__(self, name):
        try:
            # nn.Module.__getattr__ handles parameters, buffers and submodules
            return super().__getattr__(name)
        except AttributeError:
            # fall back to the wrapped model's own attributes
            return getattr(self.module, name)

This comes with the risk of an undetected name clash… discovered only later.
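
For instance, DDP itself already defines device_ids, so (continuing the snippet above) a model attribute with that name would be silently shadowed:

wrapped = myDDP(torch.nn.Linear(4, 2))  # needs the process group from above
print(wrapped.device_ids)  # DDP's own attribute wins; __getattr__ is never
                           # consulted, so a model's device_ids is unreachable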

Thanks


In DDP, the model is stored in ddp.module.
So far, I use ddp_model.module.attribute.

That seems correct, since you are just accessing a Python object's attribute.

Is it safe to use the same trick for DataParallel?

Yes, it should be safe: DataParallel also stores the wrapped model in .module, so you are again just accessing a Python object's attribute.
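
For example, a minimal sketch (MyModel and num_classes are just placeholders for a model with a custom attribute):

import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.num_classes = 2  # custom attribute

dp_model = nn.DataParallel(MyModel())
print(dp_model.module.num_classes)  # same .module access as with DDP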

One can also keep a reference to the model before wrapping it in DDP and work on that reference, to avoid ddp.module.attr.
The optimizer (and the forward/backward pass) can work on the DDP wrapper, but you can do everything else through the reference.
See the sketch below.
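
A minimal sketch of that pattern (MyModel, num_classes, and the single-process gloo setup are placeholders for illustration):

import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.num_classes = 2  # custom attribute

model = MyModel()        # keep this reference around
ddp_model = DDP(model)   # wraps the very same object

optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.1)

# forward/backward go through the DDP wrapper so gradients are synchronized
ddp_model(torch.randn(8, 4)).sum().backward()
optimizer.step()

# everything else can use the plain reference, no .module needed
assert model is ddp_model.module
print(model.num_classes)            # custom attribute
print(model.fc.weight.grad.norm())  # gradients are visible here too

Since DDP does not copy the model, model and ddp_model.module are the same object, and the optimizer sees the same parameters either way.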