Compare weights of DDP models after backward

I am using DDP with 4 GPUs on a single node. I want to print and compare the weights of the 4 model replicas after the syncing backward() call, to convince myself that they are all equal afterwards. FYI, only the backward() in the last training iteration of a batch syncs with the other ranks; all other backward() calls run inside the no_sync() context.
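For context, the gradient-accumulation pattern described above looks roughly like this. This is an illustrative sketch, not my actual training loop; `accumulate_and_step` and its arguments are hypothetical names:

```python
import torch
from torch.nn.parallel import DistributedDataParallel as DDP

def accumulate_and_step(ddp_model: DDP, optimizer, batches, loss_fn):
    """Hypothetical helper: accumulate gradients over `batches`,
    syncing only on the last micro-batch."""
    optimizer.zero_grad()
    for i, (inputs, targets) in enumerate(batches):
        if i == len(batches) - 1:
            # This backward() triggers DDP's gradient all-reduce across ranks.
            loss_fn(ddp_model(inputs), targets).backward()
        else:
            # no_sync() skips the all-reduce; gradients accumulate locally.
            with ddp_model.no_sync():
                loss_fn(ddp_model(inputs), targets).backward()
    optimizer.step()
```

After the final, syncing backward() all ranks should have averaged the same gradients, so identical starting weights plus a deterministic optimizer step should leave the replicas equal.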

My idea is to print this at the top of the validation function rather than right after the backward() call in the training function, in case there is a timing issue with the 4 processes applying their updates asynchronously.

At the top of the validation function, print the sum of the model weights together with the GPU rank ID.
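A minimal sketch of that check (`log_weight_checksum` is a name I made up; call it at the top of validation on every rank):

```python
import torch
import torch.distributed as dist

def log_weight_checksum(model):
    """Print the sum of all parameters together with this process's rank.

    If every rank prints the same sum, the replicas are (very likely)
    still in sync; a differing sum proves they have diverged.
    """
    with torch.no_grad():
        total = sum(p.sum() for p in model.parameters())
    rank = dist.get_rank() if dist.is_initialized() else 0
    print(f"rank {rank}: weight sum = {total.item():.10f}")
```

One caveat with a plain sum: two different weight tensors can in principle sum to the same value, so a stronger check would compare the full flattened parameter vector (or a hash of it) across ranks.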

I am borrowing a function from here…

import torch
from torchvision import models

model = models.resnet18()

params = []
for param in model.parameters():
    params.append(param.view(-1))
params = torch.cat(params)
print(params.shape)

torch.Size([11689512])

I only want the params that are weight parameters, not bias parameters. model.parameters() returns all trainable parameters, so it includes the bias params if present. To filter for the weights I could use model.named_parameters() and check for “weight” in each parameter’s name.
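A sketch of that name-based filter, using a small stand-in model instead of resnet18 for illustration:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.BatchNorm1d(8), nn.Linear(8, 2))

# named_parameters() yields (name, tensor) pairs, so we can filter on the name.
# Caveat: BatchNorm layers also expose a "weight" parameter (the scale gamma),
# so this filter keeps those too; exclude them explicitly if that is unwanted.
weight_params = [p.view(-1) for name, p in model.named_parameters()
                 if name.endswith("weight")]
flat = torch.cat(weight_params)
print(flat.shape)  # → torch.Size([56]); the biases are excluded
```

The 56 elements here are the two Linear weight matrices (4×8 and 8×2) plus the 8 BatchNorm scale parameters.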

Does this sound like a correct method to check that the models on all 4 ranks have the same weights after the syncing backward() update at the end of the epoch?