<?xml version="1.0" standalone="yes"?>
<Paper uid="W98-0511">
<Title>Parsing with Dependency Relations and Robust Parsing</Title>
<Section position="2" start_page="0" end_page="96" type="intro">
<SectionTitle>
1. Introduction
</SectionTitle>
<Paragraph position="0"> Our team has been working with dependency grammars for more than twenty-five years (Courtin 73). This paper presents two dependency parsers built by our team. The first one uses the notion of dependency relations in order to implement dependency grammars efficiently; it is described in the first part of the text. The second one was built with the following objectives: adding some semantic knowledge to the process of syntactic parsing and obtaining a robust parser (second part of the text).</Paragraph>
<Paragraph position="1"> 2. Parsing with dependency relations
The linguistic model we use for dependency is inspired by the model of Tesnière (Tesnière 59), which we briefly recall in order to define our terminology precisely.</Paragraph>
<Paragraph position="2"> 2.1. The linguistic model
The relationship between words is the fundamental concept associated with dependency structures (DS). Given two words of the language, a relation is established between them, defining a dominated word (or dependent) and a dominating word (or governor). This relation can be represented by an arc between two nodes, where each node is labelled by a word. The arc descends from the governor to the dependent.</Paragraph>
<Paragraph position="3"> Example: the dependency structure for the sentence « we present two parsers »:

        present
       /       \
     we      parsers
                |
               two

We can also use a linear notation with brackets and write: (we) present ((two) parsers).</Paragraph>
<Paragraph position="4"> But the graphical representation is more readable and clearly shows the hierarchy between the governor and its dependents, which, of course, can themselves have dependents.</Paragraph>
<Paragraph position="5"> Dependency grammars
A dependency grammar (the formalism used by (Hays 64)) on a vocabulary V is made of:
* a family of subsets Ci of V such that the union of the Ci is equal to V;</Paragraph>
<Paragraph position="6"> * a set of rules, each having one of the two following forms:</Paragraph>
<Paragraph position="7"> i) *(X)
ii) X(X1 ... Xi * Xi+1 ... Xn)</Paragraph>
<Paragraph position="8"> The Ci are word classes or lexico-syntactic categories and are denoted by their names (Determiner, Noun, Adjective, ...). The Xi in the rules above are category names.</Paragraph>
<Paragraph position="9"> The star shows the place of the governor relative to its dependents, so in a type ii) rule, X1 ... Xi are left dependents of the governor X and Xi+1 ... Xn are its right dependents.</Paragraph>
<Paragraph position="10"> When n = 0, the rule is written X(*) and is a terminating rule; type i) rules are initial rules.</Paragraph>
<Paragraph position="11"> Grammar example: We use the categories Determiner (D), Noun (N), Adjective (A) and Verb (V), with:
V = {drinks, eats}
D = {the, a}
N = {dog, cat, cup, milk}
A = {black, white, hot}
and the rules:
*(V)   V(N * N)   N(D A *)   N(A *)   N(*)   D(*)   A(*)
With this grammar one can build the structure:

        drinks
       /      \
     cat      milk
                |
               hot
</Paragraph>
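<Paragraph> As an illustration (our sketch, not from the paper), such a grammar can be encoded directly: a type ii) rule X(X1 ... Xi * Xi+1 ... Xn) becomes a triple of the governor category and its left and right dependent category lists. The Python names INITIAL, RULES and conforms below are hypothetical:

    # Hypothetical Python encoding of the example Hays-style grammar.
    INITIAL = {"V"}                       # type i) rule *(V)

    RULES = [
        ("V", ["N"], ["N"]),              # V(N * N)
        ("N", ["D", "A"], []),            # N(D A *)
        ("N", ["A"], []),                 # N(A *)
        ("N", [], []),                    # N(*)  -- terminating (n = 0)
        ("D", [], []),                    # D(*)
        ("A", [], []),                    # A(*)
    ]

    def conforms(cat, left, right):
        """True if some rule lets a governor of category `cat`
        take exactly these left/right dependent category lists."""
        return any(c == cat and l == left and r == right
                   for c, l, r in RULES)

    # Checking the structure drawn above, node by node:
    assert conforms("V", ["N"], ["N"])    # drinks(cat * milk)
    assert conforms("N", [], [])          # cat(*)
    assert conforms("N", ["A"], [])       # milk(hot *)
</Paragraph>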
<Paragraph position="12"> Generation
Dependency grammars are generative, working with the following generating rules:
a) choose a type i) rule (which determines the main governor),
b) choose and apply type ii) rules until we obtain a complete structure, ending entirely with terminating rules.</Paragraph>
<Paragraph position="13"> With the example grammar above, we can make the following derivation (which matches the sentence « the black cat drinks hot milk »):
*(V): drinks
V(N * N): cat drinks milk
N(D A *): the black cat drinks milk
N(A *): the black cat drinks hot milk
D(*), A(*): the terminating rules complete the structure.</Paragraph>
<Paragraph position="14"> Remark: For a given governor, the dependency grammar must contain as many rules as there are possible configurations of dependents below this governor. For example, if we want nominal phrases with at least a noun, an optional determiner and 0, 1 or 2 adjectives before the noun, we will have the grammar:
N(*)     N(A *)     N(A A *)
N(D *)   N(D A *)   N(D A A *)</Paragraph>
<Paragraph position="15"> The formalism proposed below shows a better way to describe the same things.</Paragraph>
<Section position="1" start_page="95" end_page="96" type="sub_section">
<SectionTitle>
2.2. Dependency relations
</SectionTitle>
<Paragraph position="0"> The method used in the PILAF system (Courtin 77) to build dependency structures is a direct analysis: we transform the input word chain into a dependency tree, using a form of dependency grammar and no intermediate structure.</Paragraph>
<Paragraph position="1"> But the algorithm does not directly use Tesnière-style dependency grammars because, as we have seen above, these grammars impose a combinatorial description of all the possible configurations of dependents for a given governor. To overcome this drawback, we introduce dependency relations between two lexico-syntactic categories. Example: to say that N governs A, we simply write N -> A. Dependency Relations (DR) must code not only the relation itself but also:
* the relative positions of the dependent and the governor: is it a left dependent or a right one?
* the relative positions of all the dependents of a given governor.</Paragraph>
<Paragraph position="2"> Example: We want to describe the sentence « The black cat drinks hot milk », which gives the sequence of categories: D A N V A N.</Paragraph>
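<Paragraph> To illustrate the intended economy, here is a minimal sketch of one possible DR encoding (our assumption, not the PILAF implementation; RELATIONS, PRECEDENCE and may_govern are invented names). The six combinatorial N(...) rules of the remark above reduce to two relations plus one precedence constraint:

    # Hypothetical Python sketch: a dependency relation is a triple
    # (governor category, dependent category, side of the dependent).
    LEFT, RIGHT = "left", "right"

    RELATIONS = {
        ("N", "D", LEFT),    # N -> D (determiner to the left of N)
        ("N", "A", LEFT),    # N -> A (adjective to the left of N)
        ("V", "N", LEFT),    # subject noun
        ("V", "N", RIGHT),   # object noun
    }

    # Relative order among co-dependents of the same governor:
    # below N, a D precedes any A.
    PRECEDENCE = {("N", "D", "A")}

    def may_govern(governor, dependent, side):
        """True if the DR set licenses this dependency."""
        return (governor, dependent, side) in RELATIONS

    # « The black cat drinks hot milk » : categories D A N V A N.
    assert may_govern("N", "D", LEFT) and may_govern("N", "A", LEFT)
    assert may_govern("V", "N", LEFT) and may_govern("V", "N", RIGHT)

Note that, unlike the grammar rules of section 2.1, nothing here bounds the number of adjectives under a noun; the relation N -> A covers 0, 1 or 2 of them without enumerating configurations.</Paragraph>
</Section>
</Section>
</Paper>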