<?xml version="1.0" standalone="yes"?>
<Paper uid="P81-1002">
  <Title>COMPUTATIONAL COMPLEXITY AND LEXICAL FUNCTIONAL GRAMMAR</Title>
  <Section position="2" start_page="0" end_page="7" type="intro">
    <SectionTitle>
1. INTRODUCTION
</SectionTitle>
    <Paragraph position="0"> An important goal of modern linguistic theory is to characterize as narrowly as possible the class of natural languages. An adequate linguistic theory should be broad enough to cover observed variation in human languages, and yet narrow enough to account for what might be dubbed "cognitive demands" -- among these, perhaps, the demands of learnability and parsability. If cognitive demands are to carry any real theoretical weight, then presumably a language may be a (theoretically) possible human language, and yet be "inaccessible" because it is not learnable or parsable.</Paragraph>
    <Paragraph position="1"> Formal results along these lines have already been obtained for certain kinds of Transformational Generative Grammars: for example, Peters and Ritchie [1] showed that Aspects-style unrestricted transformational grammars can generate any recursively enumerable set, while Rounds [2] [3] extended this work by demonstrating that modestly restricted transformational grammars (TGs) can generate languages whose recognition time is provably exponential. (In Rounds' proof, transformations are subject to a "terminal length non-decreasing" condition, as suggested by Peters and Myhill.) Thus, in the worst case TGs generate languages whose recognition is widely recognized to be computationally intractable. Whether this "worst case" complexity analysis has any real import for actual linguistic study has been the subject of some debate (for discussion, see Chomsky [4]; Berwick and Weinberg [5]). Without resolving that controversy here, however, one thing can be said: to make TGs efficiently parsable one might provide constraints. For instance, these additional strictures could be roughly of the sort advocated in Marcus' work on parsing [6] -- constraints specifying that TG-based languages must have parsers that meet certain "locality conditions". The Marcus constraints apparently amount to an extension of Knuth's LR(k) locality condition [7] to a (restricted) version of a two-stack deterministic push-down automaton. (The need for LR(k)-like restrictions in order to ensure efficient processability was also recognized by Rounds [2].) Recently, a new theory of grammar has been advanced with the explicitly stated aim of meeting the dual demands of learnability and parsability -- the Lexical Functional Grammars (LFGs) of Bresnan [8]. The theory of Lexical Functional Grammars is claimed to have all the descriptive merits of transformational grammar, but none of its computational unruliness. In LFG, there are no transformations (as classically described); the work formerly ascribed to transformations such as "passive" is shouldered by information stored in lexical entries associated with lexical items. The elimination of transformational power naturally gives rise to the hope that a lexically-based system would be computationally simpler than a transformational one.</Paragraph>
    <Paragraph position="2"> An interesting question, then, is to determine, as has already been done for the case of certain brands of transformational grammar, just what the "worst case" computational complexity for the recognition of LFG languages is. If the recognition time complexity for languages generated by the basic LFG theory can be as complex as that for languages generated by a modestly restricted transformational system, then presumably LFG will also have to add additional constraints, beyond those provided in its basic theory, in order to ensure efficient parsability.</Paragraph>
    <Paragraph position="3"> The main result of this paper is to show that certain Lexical Functional Grammars can generate languages whose recognition time is very likely computationally intractable, at least according to our current understanding of what is or is not rapidly solvable. Briefly, the demonstration proceeds by showing how a problem that is widely conjectured to be computationally difficult -- namely, whether there exists an assignment of 1's and 0's (or "T"s and "F"s) to the literals of a Boolean formula in conjunctive normal form that makes the formula evaluate to "1" (or "true") -- can be re-expressed as the problem of recognizing whether a particular string is or is not a member of the language generated by a certain lexical functional grammar. This "reduction" shows that in the worst case the recognition of LFG languages can be just as hard as the original Boolean satisfiability problem. Since it is widely conjectured that there cannot be a polynomial-time algorithm for satisfiability (the problem is NP-complete), there cannot be a polynomial-time recognition algorithm for LFGs in general either. Note that this result sharpens that in Kaplan and Bresnan [8]: there it is shown only that LFGs (weakly) generate some subset of the class of context-sensitive languages (including some strictly context-sensitive languages), and therefore, in the worst case, exponential time is known to be sufficient (though not necessary) to recognize any LFG language. The result in [8] thus does not address the question of how much time, in the worst case, is necessary to recognize LFG languages. The result of this paper indicates that in the worst case more than polynomial time will probably be necessary. (The reason for the hedge "probably" will become apparent below; it hinges upon the central unsolved conjecture of current complexity theory.) In short, then, this result places the LFG languages more precisely in the complexity hierarchy.</Paragraph>
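To make the satisfiability problem concrete, the following minimal sketch (in Python; ours, not from the paper) checks a truth assignment against a CNF formula and then searches all assignments by brute force. The reduction re-expresses exactly this search as string membership in an LFG language.

```python
# Minimal sketch (not from the paper): Boolean satisfiability for a
# formula in conjunctive normal form (CNF). A formula is a list of
# clauses; a clause is a list of literals; literal k > 0 stands for
# variable k, and k < 0 for its negation.
from itertools import product

# Example: (x1 OR NOT x2) AND (x2 OR x3)
formula = [[1, -2], [2, 3]]

def satisfies(formula, assignment):
    """True iff `assignment` (dict: variable -> bool) makes every clause true."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in formula
    )

def satisfiable(formula):
    """Brute-force search over all 2^n assignments -- exponential in n;
    no polynomial-time algorithm is known (SAT is NP-complete)."""
    variables = sorted({abs(lit) for clause in formula for lit in clause})
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if satisfies(formula, assignment):
            return assignment
    return None

print(satisfiable(formula))  # e.g. {1: True, 2: True, 3: True}
```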
    <Paragraph position="4"> It also turns out to be instructive to inquire into just why a lexically-based approach can turn out to be computationally difficult, and how computational tractability may be guaranteed. Advocates of lexically-based theories may have thought (and some have explicitly stated) that the banishment of transformations is a computationally wise move because transformations are computationally "expensive." Eliminate the transformations, so this casual argument goes, and one has eliminated all computational problems. Intriguingly though, when one examines the proof to be given below, the computational work done by transformations in older theories re-emerges in the lexical grammar as the problem of choosing between alternative categorizations for lexical items -- deciding, in a manner of speaking, whether a particular terminal item is a Noun or a Verb (as with the word kiss in English). This power of choice, coupled with an ability to express co-occurrence constraints over arbitrary distances across terminal tokens in a string (as in Subject-Verb number agreement), seems to be all that is required to make the recognition of LFG languages intractable. The work done by transformations has been exchanged for work done by lexical schemas, but the overall computational burden remains roughly the same.</Paragraph>
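The combinatorics at issue can be mimicked directly. In the hypothetical sketch below (ours, not the paper's construction), each categorially ambiguous token is a binary Noun/Verb choice, and a single long-distance co-occurrence constraint must hold of the joint choice; naive recognition therefore searches an exponential space of categorizations, the same shape of search as satisfiability.

```python
# Hypothetical illustration (not the paper's grammar): every token below
# is Noun/Verb-ambiguous, and an agreement-style constraint ranges over
# the whole string. Naive recognition tries all 2^n categorizations.
from itertools import product

categories = {"kiss": ("N", "V"), "fish": ("N", "V")}
sentence = ["kiss", "fish", "kiss"]

def agrees(categorization):
    """Toy long-distance co-occurrence constraint (our invention):
    the string must contain exactly one Verb, wherever it occurs."""
    return categorization.count("V") == 1

def recognizable(sentence):
    # Try every way of resolving the lexical ambiguities.
    choices = [categories[word] for word in sentence]
    return any(agrees(assignment) for assignment in product(*choices))

print(recognizable(sentence))  # True: e.g. the categorization N V N agrees
```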
    <Paragraph position="5"> This leaves the question posed in the opening paragraph: just what sorts of constraints on natural languages are required in order to ensure efficient parsability? An informal argument can be made that Marcus' work [6] provides a good first attack on just this kind of characterization. Marcus' claim was that languages easily parsed (not "garden-pathed") by people could be precisely modeled by the languages easily parsed by a certain type of restricted, deterministic, two-stack parsing machine. But this machine can be shown to be a (weak) non-canonical extension of the LR(k) grammars, as proposed by Knuth [7].</Paragraph>
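By way of contrast with the exponential search above, a locality condition of the LR(k) sort makes each decision deterministic: a category is fixed once, from a bounded lookahead window, and never revised, so recognition time stays linear in the length of the input. The sketch below is a loose illustration of that idea under an invented disambiguation rule, not a rendering of Marcus' machine.

```python
# Loose illustration of bounded-lookahead determinism (invented rule,
# not Marcus' parser): each ambiguous word is categorized exactly once,
# using only the next k tokens, so the whole pass is linear time.
def categorize(tokens, k=1):
    """Hypothetical rule: a word is a Verb if a determiner appears
    within its k-token lookahead window, and a Noun otherwise."""
    determiners = {"the", "a"}
    result = []
    for i, word in enumerate(tokens):
        lookahead = tokens[i + 1 : i + 1 + k]
        result.append("V" if determiners & set(lookahead) else "N")
    return result

print(categorize(["kiss", "the", "fish"]))  # ['V', 'N', 'N']
```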
    <Paragraph position="6"> Finally, this paper will discuss the relevance of this technical result for more down-to-earth computational linguistics. As it turns out, even though general LFGs may well be computationally intractable, it is easy to imagine a variety of additional constraints for LFG theory that provide a way to sidestep the reduction argument. All of these additional restrictions amount to making the LFG theory more restricted, in such a way that the reduction argument cannot be made to work. For example, one effective restriction is to stipulate that there can only be a finite stock of features with which to label lexical items. In any case, the moral of the story is an unsurprising one: specificity and constraints can absolve a theory of computational intractability. What may be more surprising is that the requisite locality constraints seem to be useful for a variety of theories of grammar, from transformational grammar to lexical functional grammar.</Paragraph>
  </Section>
</Paper>