TY - GEN
T1 - Communication-Efficient k-Means for Edge-Based Machine Learning
AU - Lu, Hanlin
AU - He, Ting
AU - Wang, Shiqiang
AU - Liu, Changchang
AU - Mahdavi, Mehrdad
AU - Narayanan, Vijaykrishnan
AU - Chan, Kevin S.
AU - Pasteris, Stephen
N1 - Funding Information:
This research was partly sponsored by the U.S. Army Research Laboratory and the U.K. Ministry of Defence under Agreement Number W911NF-16-3-0001. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the U.S. Army Research Laboratory, the U.S. Government, the U.K. Ministry of Defence or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.
Publisher Copyright:
© 2020 IEEE
PY - 2020/11
Y1 - 2020/11
N2 - We consider the problem of computing the k-means centers for a large high-dimensional dataset in the context of edge-based machine learning, where data sources offload machine learning computation to nearby edge servers. k-Means computation is fundamental to many data analytics, and the capability of computing provably accurate k-means centers by leveraging the computation power of the edge servers, at a low communication and computation cost to the data sources, will greatly improve the performance of these analytics. We propose to let the data sources send small summaries, generated by joint dimensionality reduction (DR) and cardinality reduction (CR), to support approximate k-means computation at reduced complexity and communication cost. By analyzing the complexity, the communication cost, and the approximation error of k-means algorithms based on state-of-the-art DR/CR methods, we show that: (i) in the single-source case, it is possible to achieve a near-optimal approximation at a near-linear complexity and a constant communication cost, (ii) in the multiple-source case, it is possible to achieve similar performance at a logarithmic communication cost, and (iii) the order of applying DR and CR significantly affects the complexity and the communication cost. Our findings are validated through experiments based on real datasets.
UR - http://www.scopus.com/inward/record.url?scp=85101828364&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85101828364&partnerID=8YFLogxK
U2 - 10.1109/ICDCS47774.2020.00062
DO - 10.1109/ICDCS47774.2020.00062
M3 - Conference contribution
AN - SCOPUS:85101828364
T3 - Proceedings - International Conference on Distributed Computing Systems
SP - 595
EP - 605
BT - Proceedings - 2020 IEEE 40th International Conference on Distributed Computing Systems, ICDCS 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 40th IEEE International Conference on Distributed Computing Systems, ICDCS 2020
Y2 - 29 November 2020 through 1 December 2020
ER -