Sending unchanged parameters in federated learning

Hi everyone, in a federated learning scenario, I am wondering whether the model parameters are sent back to the central node even when they have not been changed during local training. Does the same apply in the opposite direction, i.e. are unchanged global model parameters sent to the local nodes?
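To make the question concrete: one way a client could avoid re-sending unchanged tensors is to diff its local parameters against the global copy it received and transmit only the tensors that differ. This is a hypothetical sketch (the function name `select_changed` and the two-layer parameter dicts are my own illustration, not part of any standard FL framework); the server would reuse its own copy for any key that is missing from the update.

```python
import numpy as np

def select_changed(global_params, local_params, tol=0.0):
    """Return only the parameter tensors that differ from the global copy.

    Tensors numerically identical to the global ones are skipped, so they
    need not be re-transmitted; the receiver keeps its existing copy.
    """
    update = {}
    for name, g in global_params.items():
        local = local_params[name]
        if not np.allclose(local, g, atol=tol, rtol=0.0):
            update[name] = local
    return update

# Hypothetical two-layer model: only "w2" changed during local training.
global_params = {"w1": np.zeros((2, 2)), "w2": np.ones(3)}
local_params = {"w1": np.zeros((2, 2)), "w2": np.ones(3) * 1.1}

changed = select_changed(global_params, local_params)
print(sorted(changed))  # only the changed tensor is sent back
```

The same diffing could in principle run in the other direction, with the server sending only the tensors that changed since the client's last-known global model, at the cost of tracking per-client state.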