Math Multiple Choice Question Solving and Distractor Generation with Attentional GRU Networks

Neisarg Dave, Riley Bakes, Barton Pursel, C. Lee Giles

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

We investigate encoder-decoder GRU networks with an attention mechanism for solving a diverse array of elementary math problems with mathematical symbolic structures. We quantitatively measure the performance of recurrent models on each question type using a test set of unseen problems with a binary scoring and partial credit system. Based on our findings, we propose the use of encoder-decoder recurrent neural networks for generating distractors for mathematical multiple-choice questions. We introduce a computationally inexpensive decoding schema called character offsetting, which shows qualitative and quantitative promise for several question types. Character offsetting freezes the hidden state and the top k probabilities of the decoder's initial probability output given the encoder's input, then performs k basic greedy decodings, each initialized with one of the k frozen outputs as the first token of the decoded sequence.
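The character-offsetting schema described above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the `step` function below is a toy deterministic stand-in for one GRU-with-attention decoder step, and `VOCAB`, `greedy_decode`, and `character_offsetting` are assumed names invented here for illustration.

```python
# Sketch of "character offsetting": freeze the decoder's first-step
# distribution, take its top-k tokens, then run k independent greedy
# decodings, one per frozen initial token.
import heapq

VOCAB = ["1", "2", "+", "=", "<eos>"]

def step(hidden, token):
    """Toy decoder step: returns (new_hidden, prob_dist over VOCAB).
    A real implementation would run one GRU step with attention."""
    base = hidden * 7 + sum(map(ord, token))
    scores = [((base * (i + 3)) % 17) + 1 for i in range(len(VOCAB))]
    total = sum(scores)
    return hidden + 1, [s / total for s in scores]

def greedy_decode(hidden, first_token, max_len=10):
    """Plain greedy decoding, starting from a fixed first token."""
    seq = [first_token]
    tok = first_token
    for _ in range(max_len):
        hidden, probs = step(hidden, tok)
        tok = VOCAB[max(range(len(VOCAB)), key=probs.__getitem__)]
        if tok == "<eos>":
            break
        seq.append(tok)
    return "".join(seq)

def character_offsetting(init_hidden, k=3):
    """Freeze the first-step state and top-k tokens, then branch:
    each of the k candidates is completed greedily."""
    frozen_hidden, probs = step(init_hidden, "<bos>")
    top_k = heapq.nlargest(k, range(len(VOCAB)), key=probs.__getitem__)
    return [greedy_decode(frozen_hidden, VOCAB[i]) for i in top_k]
```

Because only the first decoding step branches, the cost is roughly k greedy decodings, which is why the schema is computationally inexpensive compared to a full beam search; the k candidates can then serve as distractor options.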

Original language: English (US)
Title of host publication: Proceedings of the 14th International Conference on Educational Data Mining, EDM 2021
Editors: I-Han Hsiao, Shaghayegh Sahebi, Francois Bouchet, Jill-Jenn Vie
Publisher: International Educational Data Mining Society
Pages: 422-430
Number of pages: 9
ISBN (Electronic): 9781733673624
State: Published - 2021
Event: 14th International Conference on Educational Data Mining, EDM 2021 - Paris, France
Duration: Jun 29, 2021 - Jul 2, 2021

Publication series

Name: Proceedings of the 14th International Conference on Educational Data Mining, EDM 2021

Conference

Conference: 14th International Conference on Educational Data Mining, EDM 2021
Country/Territory: France
City: Paris
Period: 6/29/21 - 7/2/21

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Information Systems
