Lexicon-grammar
Lexicon-Grammar is a method and a praxis for the formalized description of human languages,[1] premised on the view that the systematic investigation of lexical entries is at present the main challenge of the scientific study of languages.[2] Its development began in the late 1960s under Maurice Gross.
Its theoretical basis is Zellig S. Harris's[3][4] distributionalism,[2] and notably the notion of transformational rule.[5] The notational conventions are meant to be as clear and comprehensible as possible.[6]
The method of Lexicon-Grammar is inspired by the hard sciences.[7] It focuses on the collection of data, and hence on the real use of language, from both a quantitative and an observational point of view.
Lexicon-Grammar also requires formalisation.[8] The results of the description must be sufficiently formal to be usable for natural language processing, in particular through the development of parsers.[9][10][11][12][13] Under its information model, the results of the description take the form of two-dimensional tables, also called matrices.[5] Lexicon-Grammar tables cross-tabulate lexical entries with their syntactic-semantic properties.[2][14] They thereby make up a database of syntactic-semantic information.[15]
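By way of illustration, the following minimal Python sketch shows one way such a matrix can be represented; the verbs, property labels and function name are hypothetical placeholders invented for the example, not material from a published Lexicon-Grammar table.

    # A lexicon-grammar table as a boolean matrix: rows are lexical
    # entries, columns are syntactic-semantic properties, and each cell
    # records an acceptability judgment (+/-). All entries below are
    # hypothetical placeholders.
    TABLE = {
        "amuse":    {"N0 =: Nhum": False, "N0 V N1": True, "N0 V que P": False},
        "consider": {"N0 =: Nhum": True,  "N0 V N1": True, "N0 V que P": True},
    }

    def entries_with(prop):
        """Return the entries marked '+' in the given property column."""
        return [verb for verb, row in TABLE.items() if row.get(prop, False)]

    print(entries_with("N0 V que P"))  # -> ['consider']

Querying a property column in this way is what makes such a table usable as a database of syntactic-semantic information.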
Experiments have shown that several researchers or teams can make their observations cumulative.[16]
The term lexicon-grammar was first used by Annibale Elia.[17]
Theoretical basis
The theoretical basis of Lexicon-Grammar is Zellig Harris's distributionalism,[3] and in particular the notion of transformation in Harris's sense.[5][18] Maurice Gross was a student of Zellig Harris. The conventions for the presentation of grammatical information are intended to be as simple and transparent as possible.[6] This concern comes from Zellig Harris, whose theory is oriented towards directly observable surface forms; in this it differs from generative grammar, which typically relies on abstract structures such as deep structures.
Fact collection
The Lexicon-Grammar method is inspired by experimental science.[7] It emphasizes the collection of facts, confronting the researcher with the reality of language use, from both a quantitative and an observational point of view.[19]
Quantitatively: a lexicon-grammar includes a program of systematic description of the lexicon, which involves observing, for each lexical entry, the syntactic constructions in which it occurs.[2] This is large-scale work, which can be carried out by teams rather than by individual specialists. The exclusive search for general rules of syntax, independent of the lexical material they handle, is dismissed as a dead end.[20] This differs from generative grammar, which values the notion of generalization.
Observationally: methodological precautions are applied to ensure good reproducibility of observations, and in particular to guard against the risks associated with constructed examples.[21] One of these precautions is to take the basic sentence as the minimal unit of meaning.[22][23][14] Indeed, a word acquires a precise meaning only in context; moreover, by inserting a word into a sentence, one gains the advantage of manipulating a sequence that can be judged acceptable or unacceptable. It is at this price that syntactic-semantic properties can be considered defined with sufficient precision for it to make sense to test and check them against the whole lexicon.[2][5][24] These precautions have evolved with needs and with the appearance of new technical means. Thus, from the beginning of the 1990s, contributors to Lexicon-Grammar have been able to use examples attested in text corpora with increasing ease. This new precaution was simply added to the previous ones, positioning Lexicon-Grammar simultaneously within introspective linguistics[25] and corpus linguistics, much as advocated by Fillmore.[26] The American projects FrameNet[27] and VerbNet[28] show a relative convergence towards objectives close to those of Lexicon-Grammar.
Formalisation
Lexicon-grammar also requires formalisation.[8] The results of the description must be sufficiently formal to allow for:
- verification by comparison with the reality of language use;
- application to automatic language processing, and more particularly to deep linguistic processing, through the development of parsers by computer scientists[9][10][11][12][13] (see the sketch after this list).
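The following minimal sketch, assuming the same kind of hypothetical table as above, illustrates how such a formal description could drive a parser; it is an assumed design for illustration, not an implementation documented in the Lexicon-Grammar literature.

    # The '+' cells of an entry are read as the subcategorization frames
    # a table-driven parser is allowed to try for that verb. The table
    # and frame labels are hypothetical placeholders.
    TABLE = {
        "consider": {"N0 V N1": True, "N0 V que P": True},
        "amuse":    {"N0 V N1": True, "N0 V que P": False},
    }

    def licensed_frames(verb):
        """Frames marked '+' for this verb in the hypothetical table."""
        return [frame for frame, ok in TABLE.get(verb, {}).items() if ok]

    def parser_may_try(verb, frame):
        """A table-driven parser only attempts frames the table licenses."""
        return frame in licensed_frames(verb)

    print(parser_may_try("consider", "N0 V que P"))  # True
    print(parser_may_try("amuse", "N0 V que P"))     # False

Keeping the table outside the parser code is what allows the description to remain cumulative: new entries or properties extend the data without changing the program.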