TY - GEN
T1 - Gated attentive-autoencoder for content-aware recommendation
AU - Ma, Chen
AU - Kang, Peng
AU - Wu, Bin
AU - Wang, Qinglong
AU - Liu, Xue
N1 - Publisher Copyright:
© 2019 Association for Computing Machinery.
PY - 2019/1/30
Y1 - 2019/1/30
N2 - The rapid growth of Internet services and mobile devices provides an excellent opportunity to satisfy the strong demand for personalized item or product recommendation. However, with the tremendous increase of users and items, personalized recommender systems still face several challenging problems: (1) the difficulty of exploiting sparse implicit feedback; (2) the difficulty of combining heterogeneous data. To cope with these challenges, we propose a gated attentive-autoencoder (GATE) model, which is capable of learning fused hidden representations of items' contents and binary ratings through a neural gating structure. Based on the fused representations, our model exploits neighboring relations between items to help infer users' preferences. In particular, a word-level and a neighbor-level attention module are integrated with the autoencoder. The word-level attention learns item hidden representations from items' word sequences, favoring informative words by assigning them larger attention weights. The neighbor-level attention learns the hidden representation of an item's neighborhood by considering its neighbors in a weighted manner. We extensively evaluate our model against several state-of-the-art methods with different validation metrics on four real-world datasets. The experimental results not only demonstrate the effectiveness of our model on top-N recommendation but also provide interpretable results attributed to the attention modules.
UR - http://www.scopus.com/inward/record.url?scp=85061698813&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85061698813&partnerID=8YFLogxK
U2 - 10.1145/3289600.3290977
DO - 10.1145/3289600.3290977
M3 - Conference contribution
AN - SCOPUS:85061698813
T3 - WSDM 2019 - Proceedings of the 12th ACM International Conference on Web Search and Data Mining
SP - 519
EP - 527
BT - WSDM 2019 - Proceedings of the 12th ACM International Conference on Web Search and Data Mining
PB - Association for Computing Machinery, Inc
T2 - 12th ACM International Conference on Web Search and Data Mining, WSDM 2019
Y2 - 11 February 2019 through 15 February 2019
ER -