Abstract
Deep learning embeddings have been used successfully for many natural language processing problems. Embeddings are mostly computed for word forms, although a number of recent papers have extended them to other linguistic units such as morphemes and phrases. In this paper, we argue that learning embeddings for discontinuous linguistic units (e.g., the separated parts of a phrasal verb like "pick ... up") should also be considered. In an experimental evaluation on coreference resolution, we show that such embeddings perform better than word form embeddings.
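The paper does not include code, but the core idea can be illustrated concretely. Below is a minimal sketch, assuming discontinuous units are fused into single tokens before training a standard embedding model; the unit inventory, the `merge_discontinuous` helper, and the use of gensim's `Word2Vec` are illustrative assumptions, not the paper's actual method.

```python
# A minimal sketch (not the authors' implementation): one simple way to obtain
# an embedding for a discontinuous unit such as the phrasal verb "pick ... up"
# is to rewrite each occurrence as a single fused token ("pick_up") and then
# train a standard word embedding model on the rewritten corpus.
from gensim.models import Word2Vec

# Hypothetical inventory of discontinuous units as (first part, second part).
DISCONTINUOUS_UNITS = {("pick", "up"), ("turn", "off")}

def merge_discontinuous(tokens, max_gap=3):
    """Fuse 'pick ... up'-style patterns into a single 'pick_up' token."""
    out, i = [], 0
    while i < len(tokens):
        merged = False
        # Look ahead up to max_gap tokens for the second part of a unit.
        for j in range(i + 1, min(i + 1 + max_gap, len(tokens))):
            if (tokens[i], tokens[j]) in DISCONTINUOUS_UNITS:
                out.append(tokens[i] + "_" + tokens[j])  # fused unit token
                out.extend(tokens[i + 1 : j])            # keep intervening words
                i = j + 1
                merged = True
                break
        if not merged:
            out.append(tokens[i])
            i += 1
    return out

# Toy corpus; a real pipeline would lemmatize first so inflected forms match.
corpus = [
    merge_discontinuous(["she", "pick", "the", "book", "up"]),
    merge_discontinuous(["please", "turn", "the", "lights", "off"]),
]
model = Word2Vec(corpus, vector_size=50, window=5, min_count=1, epochs=20)
print(model.wv["pick_up"][:5])  # an embedding now exists for the fused unit
```

The design choice here, fusing the parts into one vocabulary item, lets any off-the-shelf embedding learner handle discontinuous units without modification; the trade-off is that matching relies on a predefined unit inventory.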
| Original language | English (US) |
| --- | --- |
| State | Published - 2014 |
| Event | 2nd International Conference on Learning Representations, ICLR 2014 - Banff, Canada |
| Duration | Apr 14 2014 → Apr 16 2014 |
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Linguistics and Language
- Language and Linguistics
- Education