Stochastic Attribute-Value Grammars
by Rob Malouf, Miles Osborne
Publisher: ESSLLI 2001
Number of pages: 159
This one-week course will provide an introduction to the maximum entropy principle and the construction of maximum entropy models for natural language processing. Through a combination of lectures and, as local computing facilities permit, hands-on lab exercises, students will investigate the implementation of maximum entropy models for attribute-value grammars, including such topics as ambiguity identification, feature selection, and model training and evaluation.
by Igor Bolshakov, Alexander Gelbukh
The book focuses on the basic set of ideas and facts from the fundamental science necessary for the creation of intelligent language processing tools, without going deeply into the details of specific algorithms or toy systems.
by Dan Jurafsky, James H. Martin - Stanford University
This text takes an empirical approach to the subject, based on applying statistical and machine-learning algorithms to large corpora. The authors describe a unified vision of speech and language processing. Emphasis is on practical applications.
by Edward Stabler - UCLA
What kind of computational device could use a system like a human language? This text explores the computational properties of devices that could compute morphological and syntactic analyses, and recognize semantic relations among sentences.
by Joseph D. Booth - Syncfusion, Inc.
The author guides readers through designing a simple system that can interpret and provide reasonable responses to written English text. With this foundation, readers will be prepared to tackle the greater challenges of natural language development.