The Role of Lexicons in Natural Language Processing.

Author: Guthrie, Louise (guthrie@mdso.ve.ge.com); Pustejovsky, James (jamesp@cs.brandeis.edu); Wilks, Yorick (yorick@dcs.sheffield.ac.uk); Slator, Brian M. (slator@ils.nwu.edu)
Source: Communications of the ACM, Jan 1996, Vol. 39, Issue 1, p63-72. 10p.
Abstract: This article explores the growing relationship between dictionaries and computation and, in particular, investigates whether the contents of traditional dictionaries can be of service to those trying to get computers to process and understand natural human languages. Although machine translation, by methods now considered superficial by most who work in artificial intelligence (AI), was processing large volumes of text by the mid-1960s, AI researchers who aimed to model what most people would call understanding of text wrote programs that processed only a few sentences. In a moment of great honesty five years ago, a group of AI researchers in natural language processing admitted in public how small the vocabularies of their systems really were. Remedies for this situation have been a move to larger scale and to more empirical methods, ranging from the use of neural or connectionist networks for language understanding to a return to statistical methods for the analysis of texts, as well as attempts to increase the size of the lexicons of computational systems.
Database: Library, Information Science & Technology Abstracts