TY - JOUR
T1 - Dynamic service migration and workload scheduling in edge-clouds
AU - Urgaonkar, Rahul
AU - Wang, Shiqiang
AU - He, Ting
AU - Zafer, Murtaza
AU - Chan, Kevin
AU - Leung, Kin K.
N1 - Funding Information:
This research was sponsored in part by the US Army Research Laboratory and the UK Ministry of Defence and was accomplished under Agreement Number W911NF-06-3-0001 . The views and conclusions contained in this document are those of the author(s) and should not be interpreted as representing the official policies, either expressed or implied, of the US Army Research Laboratory, the US Government, the UK Ministry of Defence or the UK Government. The US and UK Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.
Publisher Copyright:
© 2015 Elsevier B.V.
PY - 2015/9/1
Y1 - 2015/9/1
N2 - Edge-clouds provide a promising new approach to significantly reduce network operational costs by moving computation closer to the edge. A key challenge in such systems is to decide where and when services should be migrated in response to user mobility and demand variation. The objective is to optimize operational costs while providing rigorous performance guarantees. In this paper, we model this as a sequential decision making Markov Decision Problem (MDP). However, departing from traditional solution methods (such as dynamic programming) that require extensive statistical knowledge and are computationally prohibitive, we develop a novel alternate methodology. First, we establish an interesting decoupling property of the MDP that reduces it to two independent MDPs on disjoint state spaces. Then, using the technique of Lyapunov optimization over renewals, we design an online control algorithm for the decoupled problem that is provably cost-optimal. This algorithm does not require any statistical knowledge of the system parameters and can be implemented efficiently. We validate the performance of our algorithm using extensive trace-driven simulations. Our overall approach is general and can be applied to other MDPs that possess a similar decoupling property.
AB - Edge-clouds provide a promising new approach to significantly reduce network operational costs by moving computation closer to the edge. A key challenge in such systems is to decide where and when services should be migrated in response to user mobility and demand variation. The objective is to optimize operational costs while providing rigorous performance guarantees. In this paper, we model this as a sequential decision making Markov Decision Problem (MDP). However, departing from traditional solution methods (such as dynamic programming) that require extensive statistical knowledge and are computationally prohibitive, we develop a novel alternate methodology. First, we establish an interesting decoupling property of the MDP that reduces it to two independent MDPs on disjoint state spaces. Then, using the technique of Lyapunov optimization over renewals, we design an online control algorithm for the decoupled problem that is provably cost-optimal. This algorithm does not require any statistical knowledge of the system parameters and can be implemented efficiently. We validate the performance of our algorithm using extensive trace-driven simulations. Our overall approach is general and can be applied to other MDPs that possess a similar decoupling property.
UR - http://www.scopus.com/inward/record.url?scp=84939262501&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84939262501&partnerID=8YFLogxK
U2 - 10.1016/j.peva.2015.06.013
DO - 10.1016/j.peva.2015.06.013
M3 - Article
AN - SCOPUS:84939262501
SN - 0166-5316
VL - 91
SP - 205
EP - 228
JO - Performance Evaluation
JF - Performance Evaluation
M1 - 1828
ER -