I have a model that is wrapped within a DDP (DistributedDataParallel).
What is the right way to access all of the model's attributes?
I recall I had a similar issue with `DataParallel`.
In a DDP, the model is stored in `model.module`.
So far, I use `model.module.attr` everywhere.
Is there a better way? Because I have to go through the entire code to change this…
Also, I want to make sure that I am using the right model, because I need to get other things from it: gradients and other attributes from some layers.
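Here is a minimal, self-contained sketch of what I mean (`MyModel`, `some_flag`, and the single-process gloo setup are just placeholders for the demo):

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# single-process group just so DDP can be constructed for the demo
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class MyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = torch.nn.Linear(8, 8)
        self.some_flag = True  # custom, non-module attribute

    def forward(self, x):
        return self.encoder(x)

model = DDP(MyModel())

# custom attributes need the extra .module hop:
print(model.module.some_flag)

# same for per-layer inspection, e.g. gradients after a backward():
model(torch.randn(2, 8)).sum().backward()
print(model.module.encoder.weight.grad.shape)
```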
Is it safe to use the same trick for DataParallel? I want to avoid
`if isinstance(model, DDP)` / `isinstance(model, MYMODEL)` checks as well.
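For example, something duck-typed like this (`unwrap_model` is just a name I made up; both DataParallel and DistributedDataParallel store the underlying model in `.module`):

```python
def unwrap_model(model):
    # DataParallel and DistributedDataParallel both keep the original
    # model in .module; a plain model is returned unchanged
    return model.module if hasattr(model, "module") else model
```

Though this would also treat any plain model that happens to define its own `module` attribute as a wrapper.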
For now, I subclass DDP so that attribute lookups fall through to the wrapped model:

```python
import torch

class myDDP(torch.nn.parallel.DistributedDataParallel):
    def __getattr__(self, name):
        try:
            # resolve parameters, buffers, and submodules the normal way
            return super().__getattr__(name)
        except AttributeError:
            # fall back to the wrapped model's own attributes
            return getattr(self.module, name)
```
This comes with the risk of an undetected name clash… until later.
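If the silent clash is the worry, the same subclass could at least detect it up front. This is just a sketch I came up with (the warning logic is mine, not from PyTorch): at construction time it compares the names the DDP wrapper adds against the inner model's public names, since any overlap would be silently resolved to the wrapper's version:

```python
import warnings
import torch
from torch.nn.parallel import DistributedDataParallel as DDP

class myDDP(DDP):
    def __init__(self, module, *args, **kwargs):
        super().__init__(module, *args, **kwargs)
        # names the DDP wrapper adds beyond a plain nn.Module; any of
        # these defined on the inner model will be shadowed by DDP
        base = set(dir(torch.nn.Module()))
        wrapper_only = set(dir(self)) - base
        clashes = {n for n in dir(module) if not n.startswith("_")} & wrapper_only
        if clashes:
            warnings.warn(f"attributes shadowed by DDP: {sorted(clashes)}")

    def __getattr__(self, name):
        try:
            return super().__getattr__(name)
        except AttributeError:
            return getattr(self.module, name)
```

At least that way the clash is reported once when the model is wrapped, instead of surfacing later as a wrong attribute value.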