Nguyen Thị Thanh Van, Nguyen Cong Luong, Huy T. Nguyen, Shaohan Feng, Dusit Niyato, Dong In Kim
Federated Learning (FL) is a promising technique for addressing privacy issues in machine learning. However, due to the broadcast nature of the wireless channel, a key challenge of FL is its vulnerability to wireless security threats. In this paper, we therefore consider the security of model updates in FL. In particular, we propose to adopt a covert communication technique in which a friendly jammer transmits jamming signals to prevent a warden from detecting the local model update transmissions of mobile devices. Since the jamming signals reduce the devices' transmission rates, we formulate an optimization problem that jointly determines the jamming power, the local model transmission power, and the local training accuracy to minimize the FL latency, subject to a security performance requirement. The problem is non-convex, and we propose an alternating descent algorithm to solve it. Extensive simulation results demonstrate the effectiveness of the proposed algorithm and the resulting improvement in network performance.
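The alternating descent idea named in the abstract can be sketched as block-coordinate minimization: fix two of the three decision variables (jamming power, transmit power, local accuracy) and perform a one-dimensional search over the third, then cycle. The toy latency model, the covertness-style coupling `pt <= GAMMA * pj`, and all constants below are hypothetical illustrations, not the paper's formulation.

```python
import math

def golden_min(f, lo, hi, tol=1e-6):
    """1-D golden-section search for a (near-)unimodal f on [lo, hi]."""
    gr = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - gr * (b - a), a + gr * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - gr * (b - a)
        else:
            a, c = c, d
            d = a + gr * (b - a)
    return (a + b) / 2

# Toy constants (hypothetical): model size, noise power, covertness
# ratio limiting transmit power relative to jamming power, and a
# computation-cost coefficient.
S, NOISE, GAMMA, C_CMP = 1.0, 0.1, 0.5, 0.2

def latency(pj, pt, theta):
    """Toy FL latency: per-round upload + local computation, scaled by
    the number of global rounds, which grows as theta -> 1."""
    rate = math.log2(1 + pt / (NOISE + pj))   # device rate under jamming
    comm = S / rate                           # model upload time per round
    comp = C_CMP * math.log(1 / theta)        # local iterations per round
    return (comm + comp) / (1 - theta)        # rounds ~ 1 / (1 - theta)

def alternating_descent(iters=20):
    """Cycle 1-D searches over jamming power, transmit power, accuracy."""
    pj, pt, theta = 1.0, 0.3, 0.5
    for _ in range(iters):
        # Jamming-power step; transmit power is clipped to the covert limit.
        pj = golden_min(lambda x: latency(x, min(pt, GAMMA * x), theta),
                        0.1, 5.0)
        # Transmit-power step within the covertness constraint.
        pt = golden_min(lambda x: latency(pj, x, theta), 0.01, GAMMA * pj)
        # Local-accuracy step: trade local iterations against global rounds.
        theta = golden_min(lambda x: latency(pj, pt, x), 1e-3, 0.999)
    return pj, pt, theta, latency(pj, pt, theta)
```

Each block update can only keep or reduce the objective, which is the usual convergence argument for alternating descent on such non-convex problems; the search ranges here are arbitrary placeholders.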