Deep learning embeddings for discontinuous linguistic units

Wenpeng Yin, Hinrich Schütze

Research output: Contribution to conference › Paper › peer-review

Abstract

Deep learning embeddings have been successfully used for many natural language processing problems. Embeddings are mostly computed for word forms, although a number of recent papers have extended this to other linguistic units like morphemes and phrases. In this paper, we argue that learning embeddings for discontinuous linguistic units should also be considered. In an experimental evaluation on coreference resolution, we show that such embeddings perform better than word form embeddings.
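To make the idea concrete, here is a minimal, hypothetical sketch of one common way to obtain an embedding for a discontinuous unit such as the English particle verb "picked ... up": merge the unit's parts into a single pseudo-token in the training corpus and then train a standard skip-gram model. This is an illustration under those assumptions, not the authors' implementation; the helper `merge_discontinuous` and the toy corpus are invented for this example (requires gensim >= 4.0).

```python
# Illustrative sketch (not the paper's code): rewrite a corpus so that
# the two parts of a discontinuous unit become one pseudo-token, then
# train ordinary skip-gram embeddings on the rewritten sentences.
from gensim.models import Word2Vec

def merge_discontinuous(tokens, unit):
    """Replace one discontinuous unit, given as (head, tail) word forms
    possibly separated by other tokens, with a single joined
    pseudo-token placed at the position of the head."""
    head, tail = unit
    out, i = [], 0
    while i < len(tokens):
        if tokens[i] == head and tail in tokens[i + 1:]:
            j = tokens.index(tail, i + 1)
            out.append(f"{head}_{tail}")   # e.g. "picked_up"
            out.extend(tokens[i + 1:j])    # keep the intervening words
            i = j + 1
        else:
            out.append(tokens[i])
            i += 1
    return out

corpus = [
    "she picked the book up".split(),
    "he picked his friend up at noon".split(),
    "they picked apples".split(),
]
merged = [merge_discontinuous(s, ("picked", "up")) for s in corpus]

model = Word2Vec(merged, vector_size=50, window=5, min_count=1, sg=1)
print(model.wv["picked_up"][:5])  # embedding of the discontinuous unit
```

Treating the merged pseudo-token like any other vocabulary item means the unit gets a single vector reflecting its joint contexts, rather than a composition of the vectors of its scattered parts.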

Original language: English (US)
State: Published - 2014
Event: 2nd International Conference on Learning Representations, ICLR 2014 - Banff, Canada
Duration: Apr 14, 2014 – Apr 16, 2014

Conference

Conference: 2nd International Conference on Learning Representations, ICLR 2014
Country/Territory: Canada
City: Banff
Period: 4/14/14 – 4/16/14

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
  • Education
