Hello, I am training a CNN model in a federated learning setting. I train each local_model on its client's data, then average the weights from the local_model updates to obtain the weights of my global_model.
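For reference, the averaging step I described can be sketched like this (a minimal FedAvg-style sketch; `SmallCNN` and `average_weights` are hypothetical names standing in for my actual model and code, and equal client weighting is assumed):

```python
import copy
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Hypothetical tiny CNN standing in for the real model (assumption)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, 3)
        self.fc = nn.Linear(4 * 26 * 26, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))

def average_weights(local_models):
    """Average the state_dicts of the local models, key by key."""
    avg_state = copy.deepcopy(local_models[0].state_dict())
    for key in avg_state:
        for model in local_models[1:]:
            avg_state[key] = avg_state[key] + model.state_dict()[key]
        avg_state[key] = avg_state[key] / len(local_models)
    return avg_state

# After local training, load the averaged weights into the global model.
local_models = [SmallCNN() for _ in range(3)]
global_model = SmallCNN()
global_model.load_state_dict(average_weights(local_models))
```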
Since I call loss.backward() during local_model training, I can read the gradients with:
import copy

updates_list = []
for param in model.parameters():
    updates_list.append(copy.deepcopy(param.grad))  # detached copy of each gradient
print(updates_list)
However, I cannot use .grad on my global_model, because its weights come from averaging rather than from a backward pass. My question is: how do I get the updated gradients of my global_model?
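To illustrate the problem with a minimal sketch (nn.Linear stands in for my CNN here): copying averaged weights into the global model with load_state_dict() populates its parameters but never its .grad fields, since no backward() has run on a loss computed with that model.

```python
import torch
import torch.nn as nn

# The local model gets gradients the usual way, via backward().
local_model = nn.Linear(4, 2)
loss = local_model(torch.randn(8, 4)).sum()
loss.backward()
assert local_model.weight.grad is not None  # gradients exist locally

# The global model only receives copied weights, so .grad stays None.
global_model = nn.Linear(4, 2)
global_model.load_state_dict(local_model.state_dict())
print(global_model.weight.grad)  # None
```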