Getting the gradients of global_model with .grad

Hello, I am training a CNN model in a federated learning setting. I update my local_models and then average the weights from the local model updates to get the weights of my global_model (a sketch of this averaging step is shown after the snippet below). Since I call loss.backward() during local_model training, I can get the gradients by using:

import copy

updates_list = []
for param in model.parameters():
    # after loss.backward(), each parameter holds its gradient in .grad
    updates_list.append(copy.deepcopy(param.grad))
print(updates_list)
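
For reference, my averaging step looks roughly like this (a simplified sketch; local_models is the list of client models, all sharing the global_model architecture):

import copy

# average the local models' weights into the global model
# (simplified sketch; assumes identical architectures and key order)
global_state = copy.deepcopy(local_models[0].state_dict())
for key in global_state:
    for local_model in local_models[1:]:
        global_state[key] = global_state[key] + local_model.state_dict()[key]
    global_state[key] = global_state[key] / len(local_models)
global_model.load_state_dict(global_state)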

But I cannot use .grad for my global_model. My question is: how do I get the updated gradients of my global_model?

I’m not sure if I understand the use case correctly, but in case you would like to use the gradients in updates_list, you could assign these values to the .grad attributes of the parameters of global_model and call optimizer.step() afterwards. Since global_model never runs a backward pass itself, its .grad attributes are not populated automatically, so you have to set them manually.
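
A minimal sketch of that idea, assuming updates_list holds one gradient tensor per parameter of global_model in matching order (the SGD optimizer and learning rate are just examples):

import torch

# example optimizer; any optimizer that reads .grad would work
optimizer = torch.optim.SGD(global_model.parameters(), lr=0.01)

optimizer.zero_grad()
for param, grad in zip(global_model.parameters(), updates_list):
    param.grad = grad.clone()  # manually populate the .grad attribute
optimizer.step()  # applies the assigned gradients to global_model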