Building large learning models with Herbal

Jaehyon Paik, Jong W. Kim, Frank E. Ritter, Jonathan H. Morgan, Steven R. Haynes, Mark A. Cohen

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

5 Scopus citations

Abstract

In this paper, we describe a high-level behavior representation language (Herbal) and report new work on Herbal's ACT-R compiler. This work suggests that Herbal reduces model development time by a factor of 10 compared to working directly in Soar, ACT-R, or Jess. We then introduce a large ACT-R model (541 rules) that we generated in approximately 8 hours. We fit the model to human learning data. The comparison indicates that humans performing spreadsheet tasks appeared to start with some prior expertise. The comparison also suggests that ACT-R, when processing tasks consisting of hundreds of unique memory elements over time spans of twenty to forty minutes, may have problems accurately representing the learning rates of humans. In addition, our study indicates that the spacing between learning sessions has significant effects that may bear on the modeling of memory decay in ACT-R.
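For context on the learning-rate and decay claims above, the standard mechanism at issue is ACT-R's base-level learning equation (a textbook formulation; the paper may use different parameter settings). A chunk's base-level activation grows with each use and decays as a power function of time:

B_i = \ln\!\left( \sum_{j=1}^{n} t_j^{-d} \right)

where t_j is the time since the j-th use of chunk i, n is the number of uses, and d is the decay parameter (conventionally d = 0.5). Under this equation, widely spaced practice produces lower activation than massed practice for the same number of exposures, which is why the spacing between learning sessions matters for modeling memory decay in ACT-R.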

Original language: English (US)
Title of host publication: Proceedings of the 10th International Conference on Cognitive Modeling, ICCM 2010
Pages: 187-192
Number of pages: 6
State: Published - 2010
Event: 10th International Conference on Cognitive Modeling, ICCM 2010 - Philadelphia, PA, United States
Duration: Aug 5, 2010 - Aug 8, 2010

Other

Other: 10th International Conference on Cognitive Modeling, ICCM 2010
Country/Territory: United States
City: Philadelphia, PA
Period: 8/5/10 - 8/8/10

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Modeling and Simulation
