Augmented transition network
From Wikipedia, the free encyclopedia
An augmented transition network (ATN) is a type of graph-theoretic structure used in the operational definition of formal languages, especially in parsing relatively complex natural languages, and it has wide application in artificial intelligence. An ATN can, theoretically, analyze the structure of any sentence, however complicated. ATNs are modified transition networks that extend recursive transition networks (RTNs)[citation needed].
ATNs build on the idea of using finite-state machines (Markov models) to parse sentences. W. A. Woods, in "Transition Network Grammars for Natural Language Analysis", claims that by adding a recursive mechanism to a finite-state model, parsing can be achieved much more efficiently. Instead of building an automaton for a particular sentence, a collection of transition graphs is built. A grammatically correct sentence is parsed by reaching a final state in any state graph. Transitions between these graphs are simply subroutine calls from one state to any initial state of any graph in the network. A sentence is determined to be grammatically correct if a final state is reached by the last word in the sentence.
This model meets many of the goals set forth by the nature of language in that it captures the regularities of the language. That is, if there is a process that operates in a number of environments, the grammar should encapsulate the process in a single structure. Such encapsulation not only simplifies the grammar, but has the added bonus of efficiency of operation. Another advantage of such a model is the ability to postpone decisions. Many grammars use guessing when an ambiguity comes up. This means that not enough is yet known about the sentence. By the use of recursion, ATNs solve this inefficiency by postponing decisions until more is known about a sentence.
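The subroutine-call mechanism described above can be sketched as a small interpreter. The following is a minimal sketch in Python, assuming a hand-written noun-phrase/prepositional-phrase grammar; the state names, arc labels, and tiny lexicon are illustrative, not taken from Woods's paper:

```python
# Each network maps a state to (arc, destination) pairs. "CAT x" tests the
# next word's category; "PUSH net" calls a subnetwork like a subroutine;
# "POP" returns successfully to the caller.

LEXICON = {"the": "DET", "big": "ADJ", "house": "N",
           "on": "P", "hill": "N"}

NETWORKS = {
    "NP/": {
        "NP/": [("CAT DET", "NP/DET")],
        "NP/DET": [("CAT ADJ", "NP/DET"), ("CAT N", "NP/N")],
        "NP/N": [("PUSH PP/", "NP/N"), ("POP", None)],
    },
    "PP/": {
        "PP/": [("CAT P", "PP/P")],
        "PP/P": [("PUSH NP/", "PP/NP")],
        "PP/NP": [("POP", None)],
    },
}

def parse(net, state, words):
    """Return each possible remaining input after `net` can POP."""
    results = []
    for arc, dest in NETWORKS[net].get(state, []):
        if arc == "POP":
            results.append(words)              # success: report leftover input
        elif arc.startswith("CAT "):
            cat = arc.split()[1]
            if words and LEXICON.get(words[0]) == cat:
                results.extend(parse(net, dest, words[1:]))
        elif arc.startswith("PUSH "):
            sub = arc.split()[1]               # subroutine call into subnetwork
            for rest in parse(sub, sub, words):
                results.extend(parse(net, dest, rest))
    return results

def accepts(sentence):
    words = sentence.split()
    return any(rest == [] for rest in parse("NP/", "NP/", words))

print(accepts("the big house on the hill"))  # True
print(accepts("the on house"))               # False
```

Note how a grammatically correct phrase is exactly one for which some path through the networks consumes the whole input, mirroring the "final state reached by the last word" criterion.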
Example
An augmented transition network for parsing noun phrases. The diagrams depict two ATNs used by the Bolt Beranek and Newman (BBN) "Hear-What-I-Mean" (HWIM) speech-understanding system[1] of the mid-1970s, produced for ARPA's Speech Understanding Research project.[2] The system was intended to parse sentences that are questions about travel budgets, like "What is the plane fare to Ottawa?".
ATNs are finite-state graphs whose arcs may perform tests (e.g. part-of-speech checks) and actions (e.g. pushing or popping a subsidiary network).
In the diagram:
- Every state is a node, and every arc is a labelled arrow. A double-circled state is the start state.
- Lexical or category tests appear on the arc label, like CAT DET.
- Procedural actions appear on the arc label, like PUSH PP/ or POP.
- A tiny tree fragment to the right illustrates the structure that will be returned to the calling ATN after the POP.
Generic noun-phrase ATN

This ATN recognizes (or generates) some English noun phrases.
The graph therefore parses sequences such as:
[DET the] [ADJ big] [N house] [PP on the hill]
where the [PP on the hill] would have more structure inside it, illustrated by the small triangle beneath the output PP.
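The way this ATN builds structure as it accepts input can be sketched in Python as follows. Each arc action appends a labelled constituent, and the PP subnetwork returns its own subtree, which the caller splices in; the lexicon and function names are assumptions for illustration, not part of the diagram:

```python
# Recursive-descent rendering of the generic noun-phrase ATN: a determiner,
# an adjective loop, a head noun, then optional prepositional phrases.

LEXICON = {"the": "DET", "big": "ADJ", "house": "N",
           "on": "P", "hill": "N"}

def parse_np(words):
    """Parse an NP prefix; return (tree, remaining words) or None."""
    tree = []
    if not words or LEXICON.get(words[0]) != "DET":
        return None
    tree.append(("DET", words[0])); words = words[1:]
    while words and LEXICON.get(words[0]) == "ADJ":   # loop on CAT ADJ
        tree.append(("ADJ", words[0])); words = words[1:]
    if not words or LEXICON.get(words[0]) != "N":
        return None
    tree.append(("N", words[0])); words = words[1:]
    while True:                                       # optional PP adjuncts
        sub = parse_pp(words)
        if sub is None:
            break
        pp_tree, words = sub
        tree.append(("PP", pp_tree))                  # splice in PP subtree
    return tree, words

def parse_pp(words):
    """Parse a PP prefix by reading a preposition, then pushing NP."""
    if not words or LEXICON.get(words[0]) != "P":
        return None
    prep = words[0]
    sub = parse_np(words[1:])
    if sub is None:
        return None
    np_tree, rest = sub
    return [("P", prep), ("NP", np_tree)], rest

tree, rest = parse_np("the big house on the hill".split())
```

The returned tree nests the PP's own structure under a PP node, just as the small triangle in the diagram stands for the PP's internal subtree.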
Trip-specific noun-phrase ATN

This ATN is a variant specialized for parsing sentences that are about travel. It inherits the generic pattern but hard-codes trip-related words and adjuncts.
A complete generation trace for "a cheap trip to Paris by Tom" is therefore:
- Start at TRIP/ and read determiner "a".
- Loop once on CAT TRIP-ADJ to accept "cheap".
- Consume "trip" via WRD TRIP.
- Read keyword "to".
- Push the PLACE/ network, yielding "Paris".
- Encounter "by", switch to state T/BY, then accept the proper-object "Tom".
- Pop, returning the fully-expanded NP subtree.
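The steps above can be sketched as a direct Python rendering. The state and arc names in the comments follow the trace; the tiny lexicon and everything else are hypothetical illustrations:

```python
# Trip-specific noun-phrase ATN as a straight-line recognizer that also
# builds the NP subtree returned by the final POP.

TRIP_ADJS = {"cheap", "fast"}
PLACES = {"Paris", "Ottawa"}
PROPER = {"Tom"}

def parse_trip(words):
    words = list(words)
    np = []
    if not words or words.pop(0) != "a":        # start at TRIP/, read "a"
        return None
    np.append(("DET", "a"))
    while words and words[0] in TRIP_ADJS:      # loop on CAT TRIP-ADJ
        np.append(("TRIP-ADJ", words.pop(0)))
    if not words or words.pop(0) != "trip":     # consume "trip" via WRD TRIP
        return None
    np.append(("N", "trip"))
    if words[:1] == ["to"]:                     # read keyword "to"
        words.pop(0)
        if not words or words[0] not in PLACES: # push the PLACE/ network
            return None
        np.append(("PLACE", words.pop(0)))
    if words[:1] == ["by"]:                     # switch to state T/BY
        words.pop(0)
        if not words or words[0] not in PROPER:
            return None
        np.append(("BY", words.pop(0)))
    return np if not words else None            # POP at end of input

print(parse_trip("a cheap trip to Paris by Tom".split()))
# → [('DET', 'a'), ('TRIP-ADJ', 'cheap'), ('N', 'trip'), ('PLACE', 'Paris'), ('BY', 'Tom')]
```

Because the "to" and "by" adjuncts are optional, the same function also accepts shorter phrases such as "a trip", mirroring how the diagram's arcs can be skipped.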