<?xml version="1.0" standalone="yes"?>
<Paper uid="J88-4001">
  <Title>LFP: A LOGIC FOR LINGUISTIC DESCRIPTIONS AND AN ANALYSIS OF ITS COMPLEXITY</Title>
  <Section position="2" start_page="0" end_page="0" type="intro">
    <SectionTitle>
DESCRIPTION
</SectionTitle>
    <Paragraph position="0"> In this paper we investigate the properties of a new notation for specifying syntax for natural languages. It is based on the simple idea that first-order logic, though inadequate as a semantics for natural language, is quite adequate to express relations between syntactic constituents. This is the insight behind definite clause grammars (DCGs) (Pereira and Warren 1980) and, in fact, our notation is in some ways a generalization of that notation. However, we have tried to keep our formalism as much as possible like that of standard textbook first-order logic. There are actually two versions of our notation. The first works with strings of symbols and uses concatenation as a primitive operation. The second works with integers and takes the standard arithmetic operations as primitive. These integers can be regarded as indexing positions of morphemes in a sentence, but the sentence itself is not explicitly referenced. Both versions allow the recursive definition of predicates over strings and integers. This capacity for recursive definition is what gives our grammars their generative ability, and our notation has this feature in common with DCGs. However, we liberate DCGs from the Horn clause format, and we do not base the semantics of our notation on the semantics for Prolog or for logic programs. We hope that making the logic more familiar and readable will encourage more people to use logic as a means for specifying desired syntactic relations between sentential constituents in grammars. Anyone knowing the standard conventions of first-order logic should be able to read or to specify a grammar in our notation.</Paragraph>
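To make the flavor of the first (string-based) notation concrete, here is an illustrative definition of our own; the predicate names and the toy rules are assumptions for the example, not taken from the paper. A clause is written in ordinary first-order style, with concatenation as the primitive operation and recursion supplying the generative power:

```latex
% Illustrative only: a "sentence" predicate defined over strings,
% with concatenation as the primitive operation.
S(x) \;\leftrightarrow\; \exists y\,\exists z\,
  \bigl( x = y \cdot z \;\wedge\; NP(y) \;\wedge\; VP(z) \bigr)

% Recursion in the defined predicate is what yields generativity:
NP(x) \;\leftrightarrow\; N(x) \;\vee\; \exists y\,\exists z\,
  \bigl( x = y \cdot z \;\wedge\; Adj(y) \;\wedge\; NP(z) \bigr)
```

Anyone who reads standard first-order logic can read such a clause directly, which is the accessibility the paragraph above is arguing for.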
    <Paragraph position="1"> We also provide a precise semantics for our two notations. This involves using the least-fixed-point operator from denotational semantics for programming languages to explain the recursive definition of predicates. It involves as well using restricted universal and existential quantification to restrict the class of definable predicates (sets of strings). We prove a complexity-theoretic characterization for both grammar formalisms: (1) the formalism using strings and concatenation defines exactly the class EXPTIME of formal languages recognizable by deterministic Turing machines within time T(n) = 2^{cn} for some positive c; and (2) the formalism using integers defines exactly the class PTIME of languages recognizable in time T(n) = n^k for some integer k.</Paragraph>
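The integer-position idea behind the PTIME result can be sketched in ordinary code. The following is our own toy illustration (the function `make_recognizer` and the grammar for a^n b^n are assumptions for the example, not the paper's formalism): predicates take integer positions rather than strings as arguments, and memoization bounds the total work by the number of distinct predicate instances, which is polynomial in the sentence length.

```python
from functools import lru_cache

def make_recognizer(sentence):
    """Toy recognizer for a^n b^n (n >= 1), written in the spirit of
    the integer-position formalism: the predicate S ranges over the
    positions 0..len(sentence), never over the string itself."""
    n = len(sentence)

    @lru_cache(maxsize=None)
    def S(i, j):
        # S(i, j) holds iff the span between positions i and j is an S.
        # The recursive clause mirrors the rules S -> a S b | a b.
        if j - i == 2:
            return sentence[i] == "a" and sentence[i + 1] == "b"
        return (j - i > 2
                and sentence[i] == "a"
                and sentence[j - 1] == "b"
                and S(i + 1, j - 1))

    return S(0, n) if n >= 2 else False

print(make_recognizer("aaabbb"))  # True
print(make_recognizer("aabbb"))   # False
```

There are only O(n^2) instances S(i, j), each computed once thanks to the cache, so the recognizer runs in polynomial time; a grammar formalism whose predicates range over positions inherits this kind of bound, which is the intuition behind characterization (2).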
    <Paragraph position="2"> As an application of the second notation we sketch a natural way to write Head Grammars (Pollard 1984).</Paragraph>
    <Paragraph position="3"> Because these grammars can be expressed in this way, we immediately obtain the result that head languages can be recognized in polynomial time. We even obtain an estimate of the degree of the polynomial that is required, derived directly from the form of the grammatical description. Unfortunately, the estimated degree is at least twice as large as is actually necessary if one uses the algorithm of Pollard (1984) or Vijayashanker and Joshi (1985). We conjecture that, in fact, this factor of 2 can be removed. [Copyright 1988 by the Association for Computational Linguistics. Permission to copy without fee all or part of this material is granted provided that the copies are not made for direct commercial advantage and the CL reference and this copyright notice are included on the first page. To copy otherwise, or to republish, requires a fee and/or specific permission. 0362-613X/88/01001-9503.00. Computational Linguistics, Volume 14, Number 4, December 1988. William C. Rounds.]</Paragraph>
    <Paragraph position="4"> Our complexity theoretic characterizations are versions of theorems already appearing in the literature.</Paragraph>
    <Paragraph position="5"> Shapiro (1984) characterizes the class of languages definable by logic programs with a linear space restriction as the class EXPTIME. The proof of our first theorem is very much like his. Our second theorem characterizing PTIME can be viewed as a specialization of the results of Chandra and Harel (1982), Immerman (1982), and Vardi (1982), who show that the class of sets of finite mathematical structures definable by formulas of first-order logic augmented with a least-fixed-point operator is just the class of sets of structures recognizable in polynomial time. We prove both of our results in the same way, and thus show how these apparently unconnected theorems are related. The proof uses the notion of alternating Turing machines, and thereby explains the significance of this idea for the science of formal linguistics.</Paragraph>
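For readers unfamiliar with the least-fixed-point operator in this setting, the standard Kleene construction (a textbook fact, not a result of this paper) builds the defined predicate in stages from the empty relation:

```latex
\mathrm{lfp}(\Phi) \;=\; \bigcup_{k \ge 0} \Phi^{k}(\varnothing),
\qquad
\varnothing \;\subseteq\; \Phi(\varnothing) \;\subseteq\; \Phi^{2}(\varnothing) \;\subseteq\; \cdots
```

On a finite structure with n elements, a monotone operator defining a predicate of arity a can add tuples for at most n^a stages before closing, which is the intuition connecting least fixed points to polynomial time in the results of Chandra and Harel (1982), Immerman (1982), and Vardi (1982) cited above.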
    <Paragraph position="6"> We should also note that our notation will not find immediate application in current linguistic theory, because it does not allow structural descriptions to be described. We are in the process of extending and modifying the notation for this purpose. However, we think it is important to explicate the properties of the individual operations used in building strings and structures. Our first attempt is therefore to understand how concatenation of strings can be expressed in a restricted logic. We can then consider other predicates or functions on both strings and treelike structures in the same uniform way.</Paragraph>
  </Section>
</Paper>