Cross-serial dependencies
Term in linguistic syntax. From Wikipedia, the free encyclopedia.
In linguistics, cross-serial dependencies (also called crossing dependencies by some authors[1]) occur when the lines representing the dependency relations between two series of words cross over each other.[2] They are of particular interest to linguists who wish to determine the syntactic structure of natural language; languages containing an arbitrary number of them are non-context-free. On this basis, Dutch[3] and Swiss-German[4] have been proven to be non-context-free.

Example
As Swiss-German allows verbs and their arguments to be ordered cross-serially, we have the following example, taken from Shieber:[4]
...mer | em Hans | es | huus | hälfed | aastriiche
...we | Hans (dat) | the | house (acc) | help | paint
That is, "we help Hans paint the house."
Notice that the sequential noun phrases em Hans (Hans) and es huus (the house), and the sequential verbs hälfed (help) and aastriiche (paint) both form two separate series of constituents. Notice also that the dative verb hälfed and the accusative verb aastriiche take the dative em Hans and accusative es huus as their arguments, respectively.
Non-context-freeness
Let L be the set of all Swiss-German sentences. We will prove mathematically that L is not context-free.
In Swiss-German sentences, the number of verbs of a grammatical case (dative or accusative) must match the number of objects of that case. Additionally, a sentence containing an arbitrary number of such objects is admissible (in principle). Hence, we can define the following formal language, a subset of L:

L₁ = { Jan säit das mer (d'chind)^m (em Hans)^n es huus haend wele (laa)^m (hälfe)^n aastriiche | m, n ≥ 1 }

Thus, we have L₁ = L ∩ R, where R is the regular language defined by

R = Jan säit das mer (d'chind)^+ (em Hans)^+ es huus haend wele (laa)^+ (hälfe)^+ aastriiche

where the superscript plus symbol means "one or more copies". Since the set of context-free languages is closed under intersection with regular languages, we need only prove that L₁ is not context-free ([5], pp. 130–135).
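The shape of the regular language R can be illustrated with a regular expression. This is only a sketch, assuming Shieber's sentence schema from [4], with the repeated phrases separated by single spaces for simplicity:

```python
import re

# Regular expression for R (one or more copies of each repeated phrase).
# Note that R itself places no matching constraint on the counts; only the
# intersection with the set of grammatical sentences enforces matching.
R = re.compile(
    r"Jan säit das mer (d'chind )+(em Hans )+"
    r"es huus haend wele (laa )+(hälfe )+aastriiche"
)

print(bool(R.fullmatch(
    "Jan säit das mer d'chind em Hans "
    "es huus haend wele laa hälfe aastriiche")))
# True
print(bool(R.fullmatch(
    "Jan säit das mer d'chind d'chind em Hans "
    "es huus haend wele laa hälfe aastriiche")))
# True: mismatched counts are still in R
```

The second string has two copies of d'chind but only one laa; R accepts it anyway, which is why the matching constraint must come from L in the intersection L ∩ R.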
After a word substitution, L₁ is of the form { w a^m b^n x c^m d^n y | m, n ≥ 1 } for fixed strings w, x, y. Since L₁ can be mapped to { a^m b^n c^m d^n | m, n ≥ 1 } by the map sending a, b, c, d to themselves and every other symbol to the empty string, and since the context-free languages are closed under mappings from terminal symbols to terminal strings (that is, a homomorphism) ([5], pp. 130–135), we need only prove that { a^m b^n c^m d^n | m, n ≥ 1 } is not context-free.
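The erasing homomorphism can be written out directly. In this sketch the letters w, x, y stand in for the fixed material surrounding the four repeated blocks:

```python
def erase(word, keep="abcd"):
    # Homomorphism: a, b, c, d map to themselves; every other terminal
    # symbol maps to the empty string.
    return "".join(ch for ch in word if ch in keep)

# A word of the form w a^m b^n x c^m d^n y (here m = 2, n = 3) maps to
# a^m b^n c^m d^n.
print(erase("w" + "aa" + "bbb" + "x" + "cc" + "ddd" + "y"))  # aabbbccddd
```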
{ a^m b^n c^m d^n | m, n ≥ 1 } is a standard example of a non-context-free language ([5], p. 128). This can be shown by Ogden's lemma.
Suppose the language were generated by a context-free grammar. Let N be the length required in Ogden's lemma, consider the word a^N b^N c^N d^N in the language, and mark all of its letters. Then the three conditions implied by Ogden's lemma cannot all be satisfied: any decomposition u v w x y with |vwx| ≤ N places v and x inside at most two adjacent letter blocks, so pumping either changes the number of a's without changing the number of c's (or the number of b's without the number of d's), or produces letters out of order.
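For a small instance, the argument can be checked exhaustively. In this sketch all positions are marked, so Ogden's conditions reduce to the ordinary pumping-lemma requirements |vx| ≥ 1 and |vwx| ≤ N; the search confirms that no decomposition of a^4 b^4 c^4 d^4 pumps within the language:

```python
import re

def in_lang(s):
    # Membership in { a^m b^n c^m d^n : m, n >= 1 }.
    m = re.fullmatch(r"(a+)(b+)(c+)(d+)", s)
    return bool(m) and len(m.group(1)) == len(m.group(3)) \
        and len(m.group(2)) == len(m.group(4))

def pumpable(z, n_bound):
    # Search for a decomposition z = u v w x y with |vx| >= 1 and
    # |vwx| <= n_bound such that u v^i w x^i y stays in the language
    # for i = 0, 1, 2.
    for start in range(len(z)):
        for lv in range(n_bound + 1):
            for lw in range(n_bound + 1 - lv):
                for lx in range(n_bound + 1 - lv - lw):
                    if lv + lx == 0 or start + lv + lw + lx > len(z):
                        continue
                    u = z[:start]
                    v = z[start:start + lv]
                    w = z[start + lv:start + lv + lw]
                    x = z[start + lv + lw:start + lv + lw + lx]
                    y = z[start + lv + lw + lx:]
                    if all(in_lang(u + v * i + w + x * i + y)
                           for i in (0, 1, 2)):
                        return True
    return False

N = 4
z = "a" * N + "b" * N + "c" * N + "d" * N
print(pumpable(z, N))  # False: every decomposition pumps out of the language
```

Of course, checking one instance is only an illustration; the lemma requires the argument for every N, which is what the block-adjacency reasoning above provides.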
All known spoken languages which contain cross-serial dependencies can be similarly proved to be not context-free.[2] This led to the abandonment of Generalized Phrase Structure Grammar once cross-serial dependencies were identified in natural languages in the 1980s.[6]
Treatment
Research on mildly context-sensitive languages has attempted to identify a narrower and more computationally tractable subclass of context-sensitive languages that can capture the context-sensitivity found in natural languages. For example, cross-serial dependencies can be expressed in linear context-free rewriting systems (LCFRS); one can write an LCFRS grammar for { a^n b^n c^n d^n | n ≥ 1 }.[7][8][9]
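As an illustrative sketch (the rule notation and the nonterminal name are our own, not from the cited sources), such an LCFRS uses one fan-out-2 nonterminal A whose yield is a pair of strings: the base rule is A → ⟨ab, cd⟩, the recursive rule maps A⟨x, y⟩ to ⟨axb, cyd⟩, and the start rule concatenates A's two components:

```python
def derive_A(n):
    # A's yield is a *pair* of strings (fan-out 2): (a^n b^n, c^n d^n).
    # Base rule: A -> <"ab", "cd">; recursive rule: A<x, y> -> <axb, cyd>.
    x, y = "ab", "cd"
    for _ in range(n - 1):
        x, y = "a" + x + "b", "c" + y + "d"
    return x, y

def derive_S(n):
    # Start rule S<x, y> -> xy glues the two components together; the
    # components grow in lockstep, which no context-free rule can enforce.
    x, y = derive_A(n)
    return x + y

print(derive_S(3))  # aaabbbcccddd
```

The pair-valued nonterminal is exactly what lets the a–c and b–d counts be kept equal across the intervening material, mirroring how the cross-serial verb and object sequences stay in correspondence.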
References