TY - GEN
T1 - Online Federated Multitask Learning
AU - Li, Rui
AU - Ma, Fenglong
AU - Jiang, Wenjun
AU - Gao, Jing
Funding Information:
This work is sponsored by NSF IIS-1553411. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - With the widespread use of mobile devices, it becomes increasingly important to analyze distributed data collected from multiple devices. Federated learning is a distributed learning framework that takes advantage of the training data and computational ability of scattered mobile devices to learn prediction models, while multi-task learning infers personalized but shared models among devices. Some recent work has integrated federated and multi-task learning, but such approaches may be impractical and inefficient in the online scenario, e.g., when new mobile devices keep joining the mobile computing system. To address this challenge, we propose OFMTL, an online federated multi-task learning algorithm, which learns the model parameters for a new device without revisiting the data of existing devices. The model parameters are derived in an effective way that combines information inferred from local data with information borrowed from existing models. Through extensive experiments on three real datasets, we show that the proposed OFMTL framework achieves accuracy comparable to that of existing algorithms but with much smaller computation, transmission, and storage costs.
AB - With the widespread use of mobile devices, it becomes increasingly important to analyze distributed data collected from multiple devices. Federated learning is a distributed learning framework that takes advantage of the training data and computational ability of scattered mobile devices to learn prediction models, while multi-task learning infers personalized but shared models among devices. Some recent work has integrated federated and multi-task learning, but such approaches may be impractical and inefficient in the online scenario, e.g., when new mobile devices keep joining the mobile computing system. To address this challenge, we propose OFMTL, an online federated multi-task learning algorithm, which learns the model parameters for a new device without revisiting the data of existing devices. The model parameters are derived in an effective way that combines information inferred from local data with information borrowed from existing models. Through extensive experiments on three real datasets, we show that the proposed OFMTL framework achieves accuracy comparable to that of existing algorithms but with much smaller computation, transmission, and storage costs.
UR - http://www.scopus.com/inward/record.url?scp=85081352262&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081352262&partnerID=8YFLogxK
U2 - 10.1109/BigData47090.2019.9006060
DO - 10.1109/BigData47090.2019.9006060
M3 - Conference contribution
AN - SCOPUS:85081352262
T3 - Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019
SP - 215
EP - 220
BT - Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019
A2 - Baru, Chaitanya
A2 - Huan, Jun
A2 - Khan, Latifur
A2 - Hu, Xiaohua Tony
A2 - Ak, Ronay
A2 - Tian, Yuanyuan
A2 - Barga, Roger
A2 - Zaniolo, Carlo
A2 - Lee, Kisung
A2 - Ye, Yanfang Fanny
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE International Conference on Big Data, Big Data 2019
Y2 - 9 December 2019 through 12 December 2019
ER -