TY - JOUR
T1 - PicSys: Energy-Efficient Fast Image Search on Distributed Mobile Networks
T2 - IEEE Transactions on Mobile Computing
AU - Felemban, Noor
AU - Mehmeti, Fidan
AU - Khamfroush, Hana
AU - Lu, Zongqing
AU - Rallapalli, Swati
AU - Chan, Kevin
AU - La Porta, Thomas
N1 - Publisher Copyright:
© 2002-2012 IEEE.
PY - 2021/4/1
Y1 - 2021/4/1
AB - Mobile devices collect a large amount of visual data that are useful for many applications. Searching for an object of interest over a network of mobile devices can aid human analysts in a variety of situations. However, processing the information on these devices is a challenge owing to the high computational complexity of state-of-the-art computer vision algorithms, which primarily rely on Convolutional Neural Networks (CNNs). Thus, this paper builds PicSys, a system that enables answering visual search queries on a mobile network. The objective of the system is to minimize the maximum completion time over all devices while also taking into account the energy consumption of the mobile devices. First, PicSys carefully divides the computation into multiple filtering stages, such that only a small percentage of images need to run through the entire CNN pipeline. Splitting the CNN computation into multiple stages requires understanding the intermediate CNN features and systematically trading off accuracy for computation speed. Second, PicSys determines where to run each stage of the multi-stage pipeline to fully utilize the available resources. Finally, through extensive experimentation, system implementation, and simulation, we show that the performance of PicSys is close to optimal and significantly outperforms other standard algorithms.
UR - http://www.scopus.com/inward/record.url?scp=85077381655&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077381655&partnerID=8YFLogxK
DO - 10.1109/TMC.2019.2963150
M3 - Article
AN - SCOPUS:85077381655
SN - 1536-1233
VL - 20
SP - 1574
EP - 1589
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
IS - 4
M1 - 8946323
ER -