Time: May 31, 2024, 14:00–15:00
Venue: Room A1114, Science Building, Putuo Campus
Speaker: Shuyuan Wu, Assistant Researcher, Shanghai University of Finance and Economics
Host: Jinjiong Yu, Assistant Professor, East China Normal University
Abstract:
We study a fully decentralized federated learning algorithm, which is a novel gradient descent algorithm executed on a communication-based network. For convenience, we refer to it as a network gradient descent (NGD) method. In the NGD method, only statistics (e.g., parameter estimates) need to be communicated, minimizing the risk of privacy leakage. Meanwhile, different clients communicate with each other directly according to a carefully designed network structure without a central master. This greatly enhances the reliability of the entire algorithm. These nice properties inspire us to study the NGD method carefully, both theoretically and numerically. Theoretically, we start with a classical linear regression model. We find that both the learning rate and the network structure play significant roles in determining the NGD estimator's statistical efficiency. The resulting NGD estimator can be statistically as efficient as the global estimator if the learning rate is sufficiently small and the network structure is well balanced, even if the data are distributed heterogeneously. These interesting findings are then extended to general models and loss functions. Extensive numerical studies are presented to corroborate our theoretical findings. Classical deep learning models are also presented for illustration purposes.
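To make the idea concrete, below is a minimal sketch of a decentralized gradient descent of the kind the abstract describes, applied to linear regression with heterogeneous clients. It assumes one common variant of the update: each client first averages its neighbors' parameter estimates under a doubly stochastic weight matrix (here a balanced ring network), then takes a local gradient step. All data, network weights, and hyperparameters are illustrative choices, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_clients, n_local = 3, 4, 200
beta_true = np.array([1.0, -2.0, 0.5])

# Heterogeneous local data: each client draws features at a different scale.
data = []
for k in range(n_clients):
    X = rng.normal(0.0, 1.0 + 0.5 * k, size=(n_local, p))
    y = X @ beta_true + rng.normal(0.0, 0.1, size=n_local)
    data.append((X, y))

# Doubly stochastic weight matrix for a ring network (a "well balanced" structure):
# each client keeps weight 0.5 on itself and 0.25 on each of its two neighbors.
W = np.zeros((n_clients, n_clients))
for i in range(n_clients):
    W[i, i] = 0.5
    W[i, (i - 1) % n_clients] = 0.25
    W[i, (i + 1) % n_clients] = 0.25

theta = np.zeros((n_clients, p))  # one parameter estimate per client
lr = 0.01                         # small learning rate, as the theory suggests
for _ in range(2000):
    avg = W @ theta               # communicate estimates only, average neighbors
    grads = np.stack([X.T @ (X @ avg[k] - y) / n_local
                      for k, (X, y) in enumerate(data)])
    theta = avg - lr * grads      # local least-squares gradient step

# With a small learning rate and a balanced network, every client's estimate
# ends up close to the true coefficients despite the data heterogeneity.
print(np.max(np.abs(theta - beta_true)))
```

No central master appears anywhere: each row of the update uses only that client's own data and its neighbors' current estimates, which is what keeps raw data local and removes the single point of failure.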
About the speaker:
Dr. Shuyuan Wu is an Assistant Researcher at the School of Statistics and Management, Shanghai University of Finance and Economics. She received her Ph.D. in Statistics from the Guanghua School of Management, Peking University, under the supervision of Professor Hansheng Wang. Her research focuses on resampling methods, distributed computing, statistical optimization algorithms, and statistical modeling of large-scale data. Her papers have appeared in journals including JRSSB, the Journal of Business & Economic Statistics, and Statistica Sinica.