<?xml version="1.0" standalone="yes"?>
<Paper uid="P85-1021"> <Title>&quot;Grammatical Relations and Montague Grammar&quot;,</Title>
<Section position="3" start_page="0" end_page="167" type="intro"> <SectionTitle> 1 Syntax </SectionTitle>
<Paragraph position="0"> HPSG is a lexically based theory of phrase structure, so called because of the central role played by grammatical heads and their associated complements.* Roughly speaking, heads are linguistic forms (words and phrases) that exert syntactic and semantic restrictions on the phrases, called complements, that characteristically combine with them to form larger phrases.</Paragraph>
<Paragraph position="1"> Verbs are the heads of verb phrases (and sentences), nouns are the heads of noun phrases, and so forth.</Paragraph>
<Paragraph position="2"> As in most current syntactic theories, categories are represented as complexes of feature specifications.</Paragraph>
<Paragraph position="3"> But the HPSG treatment of lexical subcategorization obviates the need in the theory of categories for the notion of bar-level (in the sense of X-bar theory, prevalent in much current linguistic research). In addition, the augmentation of the system of categories with stack-valued features - features whose values are sequences of categories - unifies the theory of lexical subcategorization with the theory of binding phenomena. By binding phenomena we mean essentially non-clause-bounded dependencies, such as those involving dislocated constituents, relative and interrogative pronouns, and reflexive and reciprocal pronouns [12].</Paragraph>
<Paragraph position="4"> * HPSG is a refinement and extension of the closely related Generalized Phrase Structure Grammar [7]. The details of the theory of HPSG are set forth in [11].
More precisely, the subcategorization of a head is encoded as the value of a stack-valued feature called &quot;SUBCAT&quot;. For example, the SUBCAT value of the verb persuade is the sequence of three categories [VP, NP, NP], corresponding to the grammatical relations (GR's): controlled complement, direct object, and subject, respectively. We are adopting a modified version of Dowty's [1982] terminology for GR's, where subject is last, direct object second-to-last, etc. For semantic reasons we call the GR following a controlled complement the controller.</Paragraph>
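As a rough illustration of this encoding (the names Category, combine, maj, and subcat below are invented for exposition and are not part of the system described here), a head's SUBCAT value can be thought of as a stack of category symbols that is popped each time a complement is found:

# Illustrative sketch only: a head's SUBCAT value as a stack of categories.
# The names Category, combine, maj, and subcat are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Category:
    maj: str                                          # major category, e.g. "V" or "N"
    subcat: List[str] = field(default_factory=list)   # GR's in order; subject is last (after Dowty)

# persuade subcategorizes for [VP, NP, NP]:
# controlled complement, direct object, subject, respectively.
persuade = Category(maj="V", subcat=["VP", "NP", "NP"])

def combine(head: Category, complement: str) -> Category:
    """Saturate one grammatical relation: the complement must match the top of
    the head's SUBCAT stack, which is then popped."""
    assert head.subcat and head.subcat[0] == complement, "complement does not match SUBCAT"
    return Category(maj=head.maj, subcat=head.subcat[1:])

step1 = combine(persuade, "VP")   # SUBCAT now [NP, NP]: controlled complement found
step2 = combine(step1, "NP")      # SUBCAT now [NP]:     direct object found
step3 = combine(step2, "NP")      # SUBCAT now []:       subject found; fully saturated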
<Paragraph position="5"> One of the key differences between HPSG and its predecessor GPSG is the massive relocation of linguistic information from phrase structure rules into the lexicon [5]. This wholesale lexicalization of linguistic information in HPSG results in a drastic reduction in the number of phrase structure rules. Since rules no longer handle subcategorization, their sole remaining function is to encode a small number of language-specific principles for projecting from lexical entries to surface constituent order.</Paragraph>
<Paragraph position="6"> The schematic nature of the grammar rules allows the system to parse a large fragment of English with only a small number of rules (the system currently uses sixteen), since each rule can be used in many different situations. The constituents of each rule are sparsely annotated with features, but are fleshed out when taken together with constituents looked for and constituents found.</Paragraph>
<Paragraph position="7"> For example, the sentence The manager works can be parsed using the single rule R1 below. The rule is applied to build the noun phrase The manager by identifying the head H with the lexical element manager and the complement C1 with the lexical element the. The entire sentence is built by identifying the H with works and the C1 with the noun phrase described above. Thus the single rule R1 functions as both the S -> NP VP and NP -> Det N rules of familiar context-free grammars.</Paragraph>
R1. x -> c1 h[(CONTROL INTRANS)] a*
Figure 1. A Grammar Rule.
<Section position="1" start_page="167" end_page="167" type="sub_section"> <SectionTitle> Feature Passing </SectionTitle>
<Paragraph position="0"> The theory of HPSG embodies a number of substantive hypotheses about universal grammatical principles. Such principles as the Head Feature Principle, the Binding Inheritance Principle, and the Control Agreement Principle require that certain syntactic features specified on daughters in syntactic trees be inherited by the mothers. Highly abstract phrase structure rules thus give rise to fully specified grammatical structures in a recursive process driven by syntactic information encoded on lexical heads. Thus HPSG, unlike similar &quot;unification-based&quot; syntactic theories, embodies a strong hypothesis about the flow of relevant information in the derivation of complex structures.</Paragraph>
<Paragraph position="1"> Unification
Another important difference between HPSG and other unification-based syntactic theories concerns the form of the expressions which are actually unified.</Paragraph>
<Paragraph position="2"> In HPSG, the structures which get unified are (with limited exceptions to be discussed below) not general graph structures as in Lexical Functional Grammar [1] or Functional Unification Grammar [10], but rather flat atomic-valued feature matrices, such as those shown below.</Paragraph>
<Paragraph position="3"> [(CONTROL 0 INTRANS) (MAJ N A) (AGR 3RDSG) (PRD MINUS) (TOP MINUS)]
[(CONTROL 0) (MAJ N V) (INV PLUS)]
In the implementation of HPSG we have been able to use this restriction on the form of feature matrices to good advantage. Since for any given version of the system the range of atomic features and feature values is fixed, we are able to represent flat feature matrices, such as the ones above, as vectors of integers, where each cell in the vector represents a feature, and the integer in each cell represents a disjunction of the possible values for that feature.</Paragraph>
<Paragraph position="5"> For example, if the possible values of the MAJ feature are N, V, A, and P, then we can uniquely represent any combination of these values with an integer in the range 0..15. This is accomplished simply by assigning each possible value an index which is an integral power of 2 in this range and then adding up the indices so derived for each disjunction of values encountered.</Paragraph>
<Paragraph position="6"> Unification in such cases is thus reduced to the &quot;logical and&quot; of the integers in each cell of the vector representing the feature matrix. In this way unification of these flat structures can be done in constant time, and since &quot;logical and&quot; is generally a single machine instruction the overhead is very low.</Paragraph>
<Paragraph position="8"> There are, however, certain cases when the values of features are not atomic, but are instead themselves feature matrices. The unification of such structures could, in theory, involve arbitrary recursion on the general unification algorithm, and it would seem that we had not progressed very far from the problem of unifying general graph structures. Happily, the features for which this property of embedding holds constitute a small finite set (basically the so-called &quot;binding features&quot;). Thus we are able to segregate such features from the rest, and recurse only when such a &quot;category-valued&quot; feature is present. In practice, therefore, the time performance of the general unification algorithm is very good, essentially the same as that of the flat structure unification algorithm described above.</Paragraph>
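The scheme just described can be made concrete with a small sketch. The Python fragment below is illustrative only: the feature inventory, the binding-feature names SLASH and REL, and the use of a dictionary rather than a fixed-length vector are assumptions for exposition, not details of the implementation described here. It shows disjunctions of atomic values encoded as sums of powers of 2, unification of flat matrices reduced to a bitwise AND per feature, and recursion confined to the small fixed set of category-valued binding features.

# Illustrative sketch only: bit-vector unification of flat feature matrices,
# with limited recursion on category-valued "binding" features.
# Feature names, value inventories, and binding-feature names are hypothetical.

from typing import Dict, Optional, Union

ATOMIC_VALUES = {
    "MAJ": ["N", "V", "A", "P"],      # N=1, V=2, A=4, P=8; any disjunction is an int in 0..15
    "INV": ["PLUS", "MINUS"],
    "PRD": ["PLUS", "MINUS"],
}
BIT = {f: {v: 1 << i for i, v in enumerate(vals)} for f, vals in ATOMIC_VALUES.items()}

# The small fixed set of category-valued ("binding") features on which we may recurse.
BINDING_FEATURES = {"SLASH", "REL"}

Matrix = Dict[str, Union[int, dict]]  # feature -> bit set, or a nested matrix for binding features

def encode(feature: str, *values: str) -> int:
    """Encode a disjunction of atomic values as the sum of their power-of-2 indices."""
    return sum(BIT[feature][v] for v in values)

def unify(a: Matrix, b: Matrix) -> Optional[Matrix]:
    """Unify two feature matrices; return None on failure."""
    result: Matrix = dict(a)
    for feat, bval in b.items():
        if feat not in result:
            result[feat] = bval                    # unconstrained in a: just take b's value
        elif feat in BINDING_FEATURES:
            sub = unify(result[feat], bval)        # the only case that recurses
            if sub is None:
                return None
            result[feat] = sub
        else:
            meet = result[feat] & bval             # intersection of disjunctions: one bitwise AND
            if meet == 0:
                return None                        # no shared value: unification fails
            result[feat] = meet
    return result

# Flat case: "N or A" unified with "N or V" leaves just N; INV is merged in from one side.
x = {"MAJ": encode("MAJ", "N", "A"), "PRD": encode("PRD", "MINUS")}
y = {"MAJ": encode("MAJ", "N", "V"), "INV": encode("INV", "PLUS")}
print(unify(x, y))   # {'MAJ': 1, 'PRD': 2, 'INV': 1}

# Binding-feature case: the value is itself a matrix, so unification recurses just there.
g = {"SLASH": {"MAJ": encode("MAJ", "N")}}
h = {"SLASH": {"MAJ": encode("MAJ", "N", "V")}}
print(unify(g, h))   # {'SLASH': {'MAJ': 1}}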
</Section> </Section> </Paper>