<?xml version="1.0" standalone="yes"?>
<Paper uid="C96-2122">
  <Title>An Earley-type recognizer for dependency grammar</Title>
  <Section position="2" start_page="0" end_page="723" type="intro">
    <SectionTitle>
1 Introduction
</SectionTitle>
    <Paragraph position="0"> Dependency and constituency frameworks define different syntactic structures. Dependency grammars describe the structure of a sentence in terms of binary head-modifier (also called dependency) relations on the words of the sentence. A dependency relation is an asymmetric relation between a word called the head (governor, parent) and a word called the modifier (dependent, daughter). A word in the sentence can play the role of the head in several dependency relations, i.e. it can have several modifiers; but each word can play the role of the modifier exactly once.</Paragraph>
    <Paragraph position="1"> One special word does not play the role of the modifier in any relation, and it is named the root. The set of the dependency relations that can be defined on a sentence forms a tree, called the dependency tree (fig. 1a).</Paragraph>
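The constraints just stated (every word is a modifier exactly once, except a unique root, and the relations form a tree) can be checked mechanically. Below is a minimal sketch, not taken from the paper: the encoding of a sentence as a list of head indices, and the example head assignment for "The chef cooked a fish", are illustrative assumptions.

```python
def is_dependency_tree(heads):
    """heads[i] is the index of word i's head, or None for the root.
    Checks the two constraints above: exactly one root, and the
    head-modifier arcs are acyclic (i.e. they form a tree)."""
    roots = [i for i, h in enumerate(heads) if h is None]
    if len(roots) != 1:
        return False                     # zero or several roots
    for i in range(len(heads)):
        seen, j = set(), i
        while heads[j] is not None:      # walk from word i up to the root
            if j in seen:
                return False             # cycle: not a tree
            seen.add(j)
            j = heads[j]
    return True

# "The chef cooked a fish": cooked is the root; chef and fish modify
# cooked, the determiners modify their nouns (illustrative analysis).
#  0:The  1:chef  2:cooked  3:a  4:fish
print(is_dependency_tree([1, 2, None, 4, 2]))  # True
print(is_dependency_tree([1, 0, None]))        # False: words 0 and 1 form a cycle
```

Since each non-root word has exactly one head, a list of head indices is a complete encoding of a dependency tree.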
    <Paragraph position="2"> Although born in the same years, dependency syntax (Tesniere 1959) and constituency, or phrase structure, syntax (Chomsky 1956) (see fig. 1b) have had different impacts. The mainstream of formalisms consists almost exclusively of constituency approaches, but some of the original insights of the dependency tradition have found a role in the constituency formalisms: in particular, the concept of head of a phrase and the use of grammatical relations. The identification of the head within a phrase has been a major point of all the recent frameworks in linguistics: the X-bar theory (Jackendoff 1977) defines phrases as projections of (pre)terminal symbols, i.e. word categories; in GPSG (Gazdar et al. 1985) and HPSG (Pollard, Sag 1987), each phrase structure rule identifies a head and a related subcategorization within its right-hand side; in HG (Pollard 1984) the head is involved in the so-called head-wrapping operations, which allow the formalism to go beyond the context-free power (Joshi et al. 1991).</Paragraph>
    <Paragraph position="3"> Grammatical relations are the primitive entities of relational grammar (Perlmutter 1983) (classified as a dependency-based theory in (Mel'cuk 1988)): subject, object, xcomplement, ... label the dependency relations when the head is a verb. [Figure 1: dependency tree (a) and constituency tree (b) for the sentence &amp;quot;The chef cooked a fish&amp;quot;. The leftward or rightward orientation of the arrows in the dependency tree represents the order constraints: the modifiers that precede the head stand on its left, the modifiers that follow the head stand on its right.]</Paragraph>
    <Paragraph position="4"> Grammatical relations gained much popularity within the unification formalisms in the early 1980s. FUG (Kay 1979) and LFG (Kaplan, Bresnan 1982) exhibit mechanisms for producing a relational (or functional) structure of the sentence, based on the merging of feature representations.</Paragraph>
    <Paragraph position="5"> All the recent constituency formalisms acknowledge the importance of the lexicon, and reduce the amount of information carried by the phrasal categories. The &amp;quot;lexicalization&amp;quot; of context-free grammars (Schabes, Waters 1993) points out many similarities between the two paradigms (Rambow, Joshi 1992). Dependency syntax is an extremely lexicalized framework, because the phrase structure component is totally absent. Like the other lexicalized frameworks, the dependency approach does not produce spurious grammars, and this property is of practical interest, especially in writing realistic grammars. For instance, there are no heavily ambiguous, infinitely ambiguous or cyclic dependency grammars (such as S → SS; S → a; S → ε; see (Tomita 1985), pp. 72-73).</Paragraph>
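The kind of heavy ambiguity excluded by dependency grammars can be made concrete with a short computation, offered here only as an illustration. Under the cited grammar with the ε-rule dropped (with it, every string already has infinitely many parses), the number of parses of a^n grows as the Catalan numbers:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def parses(n):
    """Number of distinct parse trees of a^n under S -> S S | a.
    (The rule S -> epsilon is omitted: with it the count is infinite.)"""
    if n == 1:
        return 1  # the single derivation S -> a
    # S -> S S: split a^n into a^k a^(n-k) in every possible way
    return sum(parses(k) * parses(n - k) for k in range(1, n))

print([parses(n) for n in range(1, 7)])  # [1, 1, 2, 5, 14, 42]
```

No dependency grammar exhibits this behavior, since every rule application consumes the head word itself.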
    <Paragraph position="6"> Dependency syntax is attractive because of the immediate mapping of dependency structures on the predicate-argument structure (accessible by the semantic interpreter), and because of the treatment of free-word-order constructs (Sgall et al. 1986) (Mel'cuk 1988) (Hudson 1990). A number of parsers have been developed for some dependency frameworks (Fraser 1989) (Covington 1990) (Kwon, Yoon 1991) (Sleator, Temperley 1993) (Hahn et al. 1994) (Lai, Huang 1995): however, no result on algorithmic efficiency has been published, as far as we know. The theoretical worst-case complexity of O(n^3) descends from the (weak) equivalence between projective dependency grammars (a restricted class of dependency grammars) and context-free grammars (Gaifman 1965), and not from an actual parsing algorithm.</Paragraph>
    <Paragraph position="7"> This paper is a first attempt to fill a gap in the literature between the linguistic merits of the dependency approach (widely debated) and the mathematical properties of such formalisms (quite neglected). We describe an improved Earley-type recognizer for a projective dependency formalism. As a starting point we have adopted a restricted dependency formalism with context-free power that, for the sake of clarity, is described in the notation introduced by Gaifman (1965). The dependency grammar is translated into a set of parse tables that determine the conditions of applicability of the primary parser operations. The recognition algorithm then consults the parse tables to build the sets of items, as in Earley's algorithm for context-free grammars.</Paragraph>
  </Section>
</Paper>