
Computational semantics has long been considered a field divided between logical and statistical approaches, but this divide is rapidly eroding with the development of statistical models that learn compositional semantic theories from corpora and databases. This review presents a simple discriminative learning framework for defining such models and relating them to logical theories. Within this framework, we discuss the task of learning to map utterances to logical forms (semantic parsing) and the task of learning from denotations with logical forms as latent variables. We also consider models that use distributed (e.g., vector) representations rather than logical ones, showing that these can be considered part of the same overall framework for understanding meaning and structural complexity.
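To make the discriminative framework concrete, the following is a minimal sketch of a log-linear model that maps utterances to logical forms and is trained only from denotations, with the logical forms treated as latent variables. The toy arithmetic grammar, candidate sets, feature function, and training data below are illustrative assumptions, not material from the article itself.

```python
import math
from collections import defaultdict

# Toy candidate generator: each utterance comes with a few candidate
# logical forms (assumed for illustration; a real system would derive
# these compositionally from a grammar).
CANDIDATES = {
    "two plus three": ["(+ 2 3)", "(- 2 3)", "(* 2 3)"],
    "three minus two": ["(- 3 2)", "(- 2 3)", "(* 3 2)"],
}

# Supervision consists of denotations only; the correct logical form
# is never observed (it is latent).
DATA = [("two plus three", 5), ("three minus two", 1)]

def execute(lf):
    """Evaluate a toy logical form like '(+ 2 3)' to its denotation."""
    op, a, b = lf[1:-1].split()
    a, b = int(a), int(b)
    return {"+": a + b, "-": a - b, "*": a * b}[op]

def features(utterance, lf):
    """Sparse features pairing each input word with the logical form's operator."""
    op = lf[1:-1].split()[0]
    return {f"{word}&{op}": 1.0 for word in utterance.split()}

def scores(weights, utterance):
    """Unnormalized log-linear scores w . phi(x, z) for each candidate z."""
    return {lf: sum(weights.get(f, 0.0) * v for f, v in features(utterance, lf).items())
            for lf in CANDIDATES[utterance]}

def train(epochs=20, lr=0.1):
    weights = defaultdict(float)
    for _ in range(epochs):
        for utterance, denotation in DATA:
            s = scores(weights, utterance)
            log_z = math.log(sum(math.exp(v) for v in s.values()))
            p = {lf: math.exp(v - log_z) for lf, v in s.items()}  # p(z | x)
            good = {lf: p[lf] for lf in p if execute(lf) == denotation}
            total_good = sum(good.values())
            # Gradient of log p(denotation | x): expected features under
            # p(z | x, denotation) minus expected features under p(z | x).
            for lf, prob in p.items():
                target = (good.get(lf, 0.0) / total_good) if total_good else 0.0
                for f, v in features(utterance, lf).items():
                    weights[f] += lr * (target - prob) * v
    return weights

if __name__ == "__main__":
    weights = train()
    for utterance, _ in DATA:
        s = scores(weights, utterance)
        best = max(s, key=s.get)
        print(utterance, "->", best, "=", execute(best))
```

After training, the model assigns the highest score to the logical form whose denotation matches the observed answer, which illustrates how denotation-level supervision can drive the learning of compositional structure without annotated logical forms.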