Federated learning is a distributed machine learning paradigm designed to protect data privacy. It typically involves a large number of heterogeneous clients, such as servers and smartphones, whose communication and computation capabilities differ widely. In this work, we propose a new federated learning algorithm named AAFL, which uses asynchronous communication and adaptively assigns training models according to the capabilities of different clients, in order to address the communication-efficiency and privacy problems caused by large numbers of heterogeneous clients. By assigning models of different computational complexity to clients of different capabilities, and by replacing synchronous rounds with asynchronous updates, AAFL mitigates the straggler problem in which a few less capable clients slow down the training of the global model. Although clients receive sub-models of different sizes, AAFL still aggregates them into a single global model. Experimental results show that assigning models of different scales according to client capability and using asynchronous communication improve the training speed and accuracy of the global model to a certain extent, while protecting the privacy of user data.
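To illustrate how sub-models of different sizes can still be aggregated into one global model, the sketch below assumes a width-scaling scheme in which each client's weight matrix is the top-left slice of the global layer (one common convention; the abstract does not specify AAFL's exact scheme). The function name `aggregate_submodels` and the slicing layout are illustrative assumptions, not the paper's actual implementation: each global entry is averaged over exactly the clients whose sub-model covers it.

```python
import numpy as np

def aggregate_submodels(global_w, client_ws):
    """Average heterogeneous sub-model weights into one global layer.

    Assumption: each client matrix is the top-left slice of the global
    matrix (width-scaling). Entries are averaged over the clients that
    actually hold them; uncovered entries keep their previous value.
    """
    acc = np.zeros_like(global_w)        # running sum per entry
    cnt = np.zeros_like(global_w)        # how many clients cover each entry
    for w in client_ws:
        r, c = w.shape
        acc[:r, :c] += w
        cnt[:r, :c] += 1
    new_w = global_w.copy()
    covered = cnt > 0
    new_w[covered] = acc[covered] / cnt[covered]
    return new_w

# Example: a 4x4 global layer, one full-size client and one half-width client.
g = np.zeros((4, 4))
full = np.ones((4, 4))        # full-capability client's update
small = 3 * np.ones((2, 2))   # weak client trains only a 2x2 sub-model
out = aggregate_submodels(g, [full, small])
# Top-left 2x2 block averages both clients; the rest comes from the full client.
```

In an asynchronous setting, the same per-entry averaging can be applied incrementally as each client's update arrives, rather than waiting for all clients in a round.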