Long-term memory networks for question answering

Fenglong Ma, Radha Chitta, Saurabh Kataria, Jing Zhou, Palghat Ramesh, Tong Sun, Jing Gao

Research output: Contribution to journal › Conference article › peer-review

1 Scopus citation


Question answering is an important and difficult task in the natural language processing domain, because many basic natural language processing tasks can be cast as question answering tasks. Several deep neural network architectures have been developed recently that employ memory and inference components to memorize and reason over text information, and generate answers to questions. However, a major drawback of many such models is that they can only generate single-word answers. In addition, they require large amounts of training data to generate accurate answers. In this paper, we introduce the Long-Term Memory Network (LTMN), which incorporates both an external memory module and a Long Short-Term Memory (LSTM) module to comprehend the input data and generate multi-word answers. The LTMN model can be trained end-to-end using back-propagation and requires minimal supervision. We test our model on two synthetic data sets (based on Facebook's bAbI data set) and the real-world Stanford question answering data set, and show that it can achieve state-of-the-art performance.
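The abstract describes the two components of the LTMN at a high level: an external memory module that attends over embedded input facts with the question representation, and an LSTM decoder that emits a multi-word answer token by token. A minimal NumPy sketch of that pipeline is given below. All dimensions, weight matrices, and the single-hop attention scheme are illustrative assumptions, not the paper's actual parameterization; the untrained random weights are only there to show the data flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not taken from the paper).
d = 8          # embedding size
n_facts = 5    # number of memory slots (embedded input sentences)
vocab = 10     # decoder vocabulary size

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- External memory module (single attention hop) ---
M = rng.normal(size=(n_facts, d))   # embedded facts stored in memory
q = rng.normal(size=d)              # embedded question

p = softmax(M @ q)                  # attention weights over the facts
o = p @ M                           # attention-weighted memory summary
u = q + o                           # combined representation passed on

# --- LSTM decoder (one cell, greedy decoding) ---
Wx = rng.normal(size=(4 * d, d))    # input weights for i, f, o, g gates
Wh = rng.normal(size=(4 * d, d))    # recurrent weights
Wy = rng.normal(size=(vocab, d))    # projection to vocabulary logits

def lstm_step(x, h, c):
    z = Wx @ x + Wh @ h
    i, f, og, g = np.split(z, 4)    # input, forget, output gates + candidate
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(og) * np.tanh(c)
    return h, c

h, c = u, np.zeros(d)               # seed the decoder with the memory output
x = np.zeros(d)                     # start-of-answer input embedding
answer = []
for _ in range(3):                  # emit a three-token (multi-word) answer
    h, c = lstm_step(x, h, c)
    token = int(np.argmax(Wy @ h))  # greedy pick from the vocabulary
    answer.append(token)
    x = Wy[token]                   # feed the chosen token's embedding back in
```

Because every operation here is differentiable, the same structure can in principle be trained end-to-end with back-propagation, which is the property the abstract highlights.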

Original language: English (US)
Pages (from-to): 7-14
Number of pages: 8
Journal: CEUR Workshop Proceedings
State: Published - 2017
Event: 2017 IJCAI Workshop on Semantic Machine Learning, SML 2017 - Melbourne, Australia
Duration: Aug 20 2017 → …

All Science Journal Classification (ASJC) codes

  • General Computer Science


