
Abstract

Computational semantics has long been considered a field divided between logical and statistical approaches, but this divide is rapidly eroding with the development of statistical models that learn compositional semantic theories from corpora and databases. This review presents a simple discriminative learning framework for defining such models and relating them to logical theories. Within this framework, we discuss the task of learning to map utterances to logical forms (semantic parsing) and the task of learning from denotations with logical forms as latent variables. We also consider models that use distributed (e.g., vector) representations rather than logical ones, showing that these can be considered part of the same overall framework for understanding meaning and structural complexity.
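To make the framework the abstract refers to concrete, here is a minimal sketch, not the authors' implementation, of a discriminative (log-linear) semantic-parsing scorer: candidate logical forms for an utterance are scored by a weighted feature function, prediction selects the highest-scoring candidate, and training pushes up the gold parse's features against the model expectation. The feature function, candidate logical forms, and example data below are illustrative assumptions only.

```python
# Minimal sketch of a discriminative (log-linear) semantic parser scorer.
# Feature names, candidates, and data are toy assumptions, not from the article.
import math
from collections import defaultdict


def features(utterance, logical_form):
    """Toy feature function: co-occurrences of utterance words with
    predicates appearing in the candidate logical form."""
    feats = defaultdict(float)
    for word in utterance.split():
        for pred in logical_form:
            feats[(word, pred)] += 1.0
    return feats


def score(weights, utterance, logical_form):
    """Linear score w . phi(x, z)."""
    return sum(weights.get(f, 0.0) * v
               for f, v in features(utterance, logical_form).items())


def predict(weights, utterance, candidates):
    """Return the highest-scoring candidate logical form."""
    return max(candidates, key=lambda z: score(weights, utterance, z))


def sgd_step(weights, utterance, candidates, gold, lr=0.1):
    """One step of conditional log-likelihood training: increase the
    gold parse's feature weights, decrease the model expectation."""
    scores = [score(weights, utterance, z) for z in candidates]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    Z = sum(exps)
    probs = [e / Z for e in exps]
    # Gradient of log p(gold | x) = phi(x, gold) - E_p[phi(x, z)]
    for f, v in features(utterance, gold).items():
        weights[f] = weights.get(f, 0.0) + lr * v
    for p, z in zip(probs, candidates):
        for f, v in features(utterance, z).items():
            weights[f] = weights.get(f, 0.0) - lr * p * v


# Tiny usage example with made-up candidate logical forms.
utterance = "capital of texas"
candidates = [("capital", "texas"), ("population", "texas")]
gold = ("capital", "texas")
weights = {}
for _ in range(10):
    sgd_step(weights, utterance, candidates, gold)
print(predict(weights, utterance, candidates))  # -> ('capital', 'texas')
```

In the latent-variable setting the abstract also mentions, the gold logical form is not observed; only its denotation is, so training would instead marginalize over all candidates whose denotation matches the answer.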
