<?xml version="1.0" standalone="yes"?>
<Paper uid="P85-1013">
  <Title>MODULAR LOGIC GRAMMARS</Title>
  <Section position="1" start_page="0" end_page="0" type="metho">
    <SectionTitle>
MODULAR LOGIC GRAMMARS
</SectionTitle>
    <Paragraph position="0"/>
  </Section>
  <Section position="2" start_page="0" end_page="0" type="metho">
    <SectionTitle>
ABSTRACT
</SectionTitle>
    <Paragraph position="0"> This report describes a logic grammar formalism, Modular Logic Grammars, exhibiting a high degree of modularity between syntax and semantics. There is a syntax rule compiler (compiling into Prolog) which takes care of the building of analysis structures and the interface to a clearly separated semantic interpretation component dealing with scoping and the construction of logical forms. The whole system can work in either a one-pass mode or a two-pass mode. In the one-pass mode, logical forms are built directly during parsing through interleaved calls to semantics, added automatically by the rule compiler. In the two-pass mode, syntactic analysis trees are built automatically in the first pass, and then given to the (one-pass) semantic component. The grammar formalism includes two devices which cause the automatically built syntactic structures to differ from derivation trees in two ways: (1) There is a shift operator, for dealing with left-embedding constructions such as English possessive noun phrases while using right-recursive rules (which are appropriate for Prolog parsing). (2) There is a distinction in the syntactic formalism between strong non-terminals and weak non-terminals, which is important for distinguishing major levels of grammar.</Paragraph>
  </Section>
  <Section position="3" start_page="0" end_page="104" type="metho">
    <SectionTitle>
I. INTRODUCTION
</SectionTitle>
    <Paragraph position="0"> The term logic grammar will be used here, in the context of natural language processing, to mean a logic programming system (implemented normally in Prolog), which associates semantic representations (normally in some version of predicate logic) with natural language text. Logic grammars may have varying degrees of modularity in their treatments of syntax and semantics. There may or may not be an isolatable syntactic component.</Paragraph>
    <Paragraph position="1"> In writing metamorphosis grammars (Colmerauer, 1978), or definite clause grammars, DCG's (a special case of metamorphosis grammars, Pereira and Warren, 1980), it is possible to build logical forms directly in the syntax rules by letting non-terminals have arguments that represent partial logical forms being manipulated. Some of the earliest logic grammars (e.g., Dahl, 1977) used this approach. There is certainly an appeal in being direct, but there are some disadvantages in this lack of modularity. One disadvantage is that it seems difficult to get an adequate treatment of the scoping of quantifiers (and more generally focalizers, McCord, 1981) when the building of logical forms is too closely bonded to syntax. Another disadvantage is just a general result of lack of modularity: it can be harder to develop and understand syntax rules when too much is going on in them.</Paragraph>
    <Paragraph position="2"> The logic grammars described in McCord (1982, 1981) were three-pass systems, where one of the main points of the modularity was a good treatment of scoping. The first pass was the syntactic component, written as a definite clause grammar, where syntactic structures were explicitly built up in the arguments of the non-terminals. Word sense selection and slot-filling were done in this first pass, so that the output analysis trees were actually partially semantic. The second pass was a preliminary stage of semantic interpretation in which the syntactic analysis tree was reshaped to reflect proper scoping of modifiers. The third pass took the reshaped tree and produced logical forms in a straightforward way by carrying out modification of nodes by their daughters using a modular system of rules that manipulate semantic items -- consisting of logical forms together with terms that determine how they can combine.</Paragraph>
    <Paragraph position="3"> The CHAT-80 system (Pereira and Warren, 1982, Pereira, 1983) is a three-pass system. The first pass is a purely syntactic component using an extraposition grammar (Pereira, 1981) and producing syntactic analyses in rightmost normal form. The second pass handles word sense selection and slot-filling, and the third pass handles some scoping phenomena and the final semantic interpretation.</Paragraph>
    <Paragraph position="4"> One gets a great deal of modularity between syntax and semantics in that the first component has no elements of semantic interpretation at all.</Paragraph>
    <Paragraph position="5"> In McCord (1984) a one-pass semantic interpretation component, SEM, for the EPISTLE system (Miller, Heidorn and Jensen, 1981) was described.</Paragraph>
    <Paragraph position="6"> SEM has been interfaced both to the EPISTLE NLP grammar (Heidorn, 1972, Jensen and Heidorn, 1983), as well as to a logic grammar, SYNT, written as a DCG by the author. These grammars are purely syntactic and use the EPISTLE notion (op. cit.) of approximate parse, which is similar to Pereira's notion of rightmost normal form, but was developed independently. Thus SYNT/SEM is a two-pass system with a clear modularity between syntax and semantics. In DCG's and extraposition grammars, the building of analysis structures (either logical forms or syntactic trees) must be specified explicitly in the syntax rules. A certain amount of modularity is then lost, because the grammar writer must be aware of manipulating these structures, and the possibility of using the grammar in different ways is reduced. In Dahl and McCord (1983), a logic grammar formalism was described, modifier structure grammars (MSG's), in which structure-building (of annotated derivation trees) is implicit in the formalism. MSG's look formally like extraposition grammars, with the additional ingredient that semantic items (of the type used in McCord (1981)) can be indicated on the left-hand sides of rules, and contribute automatically to the construction of a syntactico-semantic tree much like that in McCord (1981). These MSG's were used interpretively in parsing, and then (essentially) the two-pass semantic interpretation system of McCord (1981) was used to get logical forms. So, totally there were three passes in this system.</Paragraph>
    <Paragraph position="7"> In this report, I wish to describe a logic grammar system, modular logic grammars (MLG's), with the following features: There is a syntax rule compiler which takes care of the building of analysis structures and the interface to semantic interpretation.</Paragraph>
    <Paragraph position="8"> There is a clearly separated semantic interpretation component dealing with scoping and the construction of logical forms.</Paragraph>
    <Paragraph position="9"> The whole system (syntax and semantics) can work optionally in either a one-pass mode or a two-pass mode.</Paragraph>
    <Paragraph position="10"> In the one-pass mode, no syntactic structures are built, but logical forms are built directly during parsing through interleaved calls to the semantic interpretation component, added automatically by the rule compiler.</Paragraph>
    <Paragraph position="11"> In the two-pass mode, the calls to the semantic interpretation component are not interleaved, but are made in a second pass, operating on syntactic analysis trees produced (automatically) in the first pass.</Paragraph>
    <Paragraph position="12"> The syntactic formalism includes a device, called the shift operator, for dealing with left-embedding constructions such as English possessive noun phrases (&amp;quot;my wife's brother's friend's car&amp;quot;) and Japanese relative clauses.</Paragraph>
    <Paragraph position="13"> The shift operator instructs the rule compiler to build the structures appropriate for left-embedding. These structures are not derivation trees, because the syntax rules are right-recursive, owing to the top-down parsing associated with Prolog.</Paragraph>
    <Paragraph position="14"> There is a distinction in the syntactic formalism between strong non-terminals and weak non-terminals, which is important for distinguishing major levels of grammar and which simplifies the working of semantic interpretation. This distinction also makes the (automatically produced) syntactic analysis trees much more readable and natural linguistically.</Paragraph>
    <Paragraph position="15"> In the absence of shift constructions, these trees are like derivation trees, but only with nodes corresponding to strong non-terminals.</Paragraph>
    <Paragraph position="16"> In an experimental MLG, the semantic component handles all the scoping phenomena handled by that in McCord (1981) and more than the semantic component in McCord (1984). The logical form language is improved over that in the previous systems.</Paragraph>
    <Paragraph position="17"> The MLG formalism allows for a great deal of modularity in natural language grammars, because the syntax rules can be written with very little awareness of semantics or the building of analysis structures, and the very same syntactic component can be used in either the one-pass or the two-pass mode described above.</Paragraph>
    <Paragraph position="18"> Three other logic grammar systems designed with modularity in mind are Hirschman and Puder (1982), Abramson (1984) and Porto and Filgueiras (1984).</Paragraph>
    <Paragraph position="19"> These will be compared with MLG's in Section 6.</Paragraph>
  </Section>
  <Section position="4" start_page="104" end_page="106" type="metho">
    <SectionTitle>
2. THE MLG SYNTACTIC FORMALISM
</SectionTitle>
    <Paragraph position="0"> The syntactic component for an MLG consists of a declaration of the strong non-terminals, followed by a sequence of MLG syntax rules. The declaration of strong non-terminals is of the form strongnonterminals(NT1.NT2. ... .NTn.nil).</Paragraph>
    <Paragraph position="1"> where the NTi are the desired strong non-terminals (only their principal functors are indicated).</Paragraph>
    <Paragraph position="2"> Non-terminals that are not declared strong are called weak. The significance of the strong/weak distinction will be explained below.</Paragraph>
    <Paragraph position="3"> MLG syntax rules are of the form A ==&gt; B where A is a non-terminal and B is a rule body. A rule body is any combination of surface terminals, logical terminals, goals, shifted non-terminals, non-terminals, the symbol 'nil', and the cut symbol '/', using the sequencing operator ':' and the 'or' symbol '|'. (We represent left-to-right sequencing with a colon instead of a comma, as is often done in logic grammars.) These rule body elements are Prolog terms (normally with arguments), and they are distinguished formally as follows.</Paragraph>
    <Paragraph position="4"> A surface terminal is of the form +A, where A is any Prolog term. Surface terminals correspond to ordinary terminals in DCG's (they match elements of the surface word string), and the notation is often [A] in DCG's.</Paragraph>
    <Paragraph position="5"> A logical terminal is of the form Op-LF, where Op is a modification operator and LF is a logical form. Logical terminals are special cases of semantic items, the significance of which will be explained below. Formally, the rule compiler  recognizes them as being terms of the form A-B.</Paragraph>
    <Paragraph position="6"> There can be any number of them in a rule body.</Paragraph>
    <Paragraph position="7"> A goal is of the form $A, where A is a term representing a Prolog goal. (This is the usual provision for Prolog procedure calls, which are often indicated by enclosure in braces in DCG's.) A shifted non-terminal is either of the form %A, or of the form F%A, where A is a weak non-terminal and F is any term. (In practice, F will be a list of features.) As indicated in the introduction, the shift operator '%' is used to handle left-embedding constructions in a right-recursive rule system.</Paragraph>
    <Paragraph position="8"> Any rule body element not of the above four forms and not 'nil' or the cut symbol is taken to be a non-terminal.</Paragraph>
    <Paragraph position="9"> A terminal is either a surface terminal or a logical terminal. Surface terminals are building blocks for the word string being analyzed, and logical terminals are building blocks for the analysis structures.</Paragraph>
    <Paragraph position="10"> A syntax rule is called strong or weak, according as the non-terminal on its left-hand side is strong or weak.</Paragraph>
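    <Paragraph> As an illustration of these rule body elements, the weak transitive-verb rule used later in Section 3 combines a surface terminal, a goal, and a logical terminal: transverb(X,Y) ==&gt; +V: $tv(V,X,Y,P): l-P. Here +V matches the verb in the word string, the goal $tv(V,X,Y,P) consults the lexicon, and l-P is a logical terminal whose operator 'l' is explained below.</Paragraph>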
    <Paragraph position="11"> It can be seen that on a purely formal level, the only differences between MLG syntax rules and DCG's are (1) the appearance of logical terminals in rule bodies of MLG's, (2) the use of the shift operator, and (3) the distinction between strong and weak non-terminals. However, for a given linguistic coverage, the syntactic component of an MLG will normally be more compact than the corresponding DCG because structure-building must be explicit in DCG's. In this report, the arrow '--&gt;' (as opposed to '==&gt;') will be used for DCG rules, and the same notation for sequencing, terminals, etc. will be used for DCG's as for MLG's.</Paragraph>
    <Paragraph position="12"> What is the significance of the strong/weak distinction for non-terminals and rules? Roughly, a strong rule should be thought of as introducing a new level of grammar, whereas a weak rule defines analysis within a level. Major categories like sentence and noun phrase are expanded by strong rules, but auxiliary rules like the recursive rules that find the postmodifiers of a verb are weak rules. An analogy with ATN's (Woods, 1970) is that strong non-terminals are like the start categories of subnetworks (with structure-building POP arcs for termination), whereas weak non-terminals are like internal nodes.</Paragraph>
    <Paragraph position="13"> In the one-pass mode, the MLG rule compiler makes the following distinction for strong and weak rules. In the Horn clause translation of a strong rule, a call to the semantic interpretation component is compiled in at the end of the clause. The non-terminals appearing in rules (both strong and weak) are given extra arguments which manipulate semantic structures used in the call to semantic interpretation. No such call to semantics is compiled in for weak rules. Weak rules only gather information to be used in the call to semantics made by the next higher strong rule. (Also, a shift generates a call to semantics.) In the two-pass mode, where syntactic analysis trees are built during the first pass, the rule compiler builds in the construction of a tree node corresponding to every strong rule. The node is labeled essentially by the non-terminal appearing on the left-hand side of the strong rule. (A shift also generates the construction of a tree node.) Details of rule compilation will be given in the next section.</Paragraph>
    <Paragraph position="14"> As indicated above, logical terminals, and more generally semantic items, are of the form Operator-LogicalForm.</Paragraph>
    <Paragraph position="15"> The Operator is a term which determines how the semantic item can combine with other semantic items during semantic interpretation. (In this combination, new semantic items are formed which are no longer logical terminals.) Logical terminals are most typically associated with lexical items, although they are also used to produce certain non-lexical ingredients in logical form analysis. An example for the lexical item &amp;quot;each&amp;quot; might be Q/P - each(P,Q).</Paragraph>
    <Paragraph position="16"> Here the operator Q/P is such that when the &amp;quot;each&amp;quot; item modifies, say, an item having logical form man(X), P gets unified with man(X), and the resulting semantic item is</Paragraph>
    <Paragraph position="17"> @Q - each(man(X),Q) </Paragraph>
    <Paragraph position="18"> where @Q is an operator which causes Q to get unified with the logical form of a further modificand.</Paragraph>
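    <Paragraph> A minimal Prolog sketch of this combination step (the predicate 'combine' is a hypothetical helper used only for illustration, not the 'semant' code of Section 4, and it assumes '@' is declared as a prefix operator): combine(Q/P - LF, ModLF, @Q - LF) &lt;- P = ModLF. For instance, combine(Q/P - each(P,Q), man(X), Item) yields Item = @Q - each(man(X),Q).</Paragraph>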
    <Paragraph position="19"> Details of the use of semantic items will be given in Section 4.</Paragraph>
    <Paragraph position="20"> Now let us look at the syntactic component of a sample MLG which covers the same ground as a well-known DCG. The following DCG is taken essentially from Pereira and Warren (1980). It is the sort of DCG that builds logical forms directly by manipulating partial logical forms in arguments of the grammar symbols.</Paragraph>
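    <Paragraph> A DCG of this kind can be sketched as follows (an approximate rendering in the notation of this report, with ':' sequencing, '+' surface terminals, and '$' goals; the lexical predicates dt, n, nm, tv, iv are those of the lexicon below, where n is assumed to hold noun entries such as n(man,X,man(X))):
sent(P) --&gt; np(X,P1,P): vp(X,P1).
np(X,P1,P) --&gt; det(P2,P1,P): noun(X,P3): relclause(X,P3,P2).
np(X,P,P) --&gt; name(X).
vp(X,P) --&gt; transverb(X,Y,P1): np(Y,P1,P).
vp(X,P) --&gt; intransverb(X,P).
relclause(X,P1,P1&amp;P2) --&gt; +that: vp(X,P2).
relclause(X,P,P) --&gt; nil.
det(P1,P2,P) --&gt; +D: $dt(D,P1,P2,P).
noun(X,P) --&gt; +N: $n(N,X,P).
name(X) --&gt; +X: $nm(X).
transverb(X,Y,P) --&gt; +V: $tv(V,X,Y,P).
intransverb(X,P) --&gt; +V: $iv(V,X,P).</Paragraph>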
    <Paragraph position="22">  nm(john). nm(mary).</Paragraph>
    <Paragraph position="23">  dt(every,P1,P2,all(P1,P2)). dt(a,P1,P2,ex(P1,P2)). tv(loves,X,Y,love(X,Y)). iv(lives,X,live(X)). The syntactic component of an analogous MLG is as follows. The lexicon is exactly the same as that of the preceding DCG. For reference below, this grammar will be called MLGRAM.</Paragraph>
    <Paragraph position="24"> strongnonterminals(sent.np.relclause.det.nil). sent ==&gt; np(X): vp(X).</Paragraph>
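    <Paragraph> The remaining rules of MLGRAM can be sketched as follows (an approximate reconstruction: the det and transverb rules are the ones quoted in Section 3, and the other rules are filled in analogously):
np(X) ==&gt; det: noun(X): relclause(X).
np(X) ==&gt; +X: $nm(X).
vp(X) ==&gt; transverb(X,Y): np(Y).
vp(X) ==&gt; intransverb(X).
relclause(X) ==&gt; +that: vp(X).
relclause(X) ==&gt; nil.
det ==&gt; +D: $dt(D,P1,P2,P): P2/P1-P.
noun(X) ==&gt; +N: $n(N,X,P): l-P.
transverb(X,Y) ==&gt; +V: $tv(V,X,Y,P): l-P.
intransverb(X) ==&gt; +V: $iv(V,X,P): l-P.</Paragraph>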
    <Paragraph position="26"> This small grammar illustrates all the ingredients of MLG syntax rules except the shift operator. The shift will be illustrated below. Note that 'sent' and 'np' are strong categories but 'vp' is weak.</Paragraph>
    <Paragraph position="27"> A result is that there will be no call to semantics at the end of the 'vp' rule. Instead, the semantic structures associated with the verb and object are passed up to the 'sent' level, so that the subject and object are &amp;quot;thrown into the same pot&amp;quot; for semantic combination. (However, their surface order is not forgotten.) There are only two types of modification operators appearing in the semantic items of this MLG: 'l' and P2/P1. The operator 'l' means 'left-conjoin'. Its effect is to left-conjoin its associated logical form to the logical form of the modificand (although its use in this small grammar is almost trivial). The operator P2/P1 is associated with determiners, and its effect has been illustrated above.</Paragraph>
    <Paragraph position="28"> The semantic component will be given below in Section 4. A sample semantic analysis for the sentence &amp;quot;Every man that lives loves a woman&amp;quot; is all(man(X1)&amp;live(X1),ex(woman(X2),love(X1,X2))).</Paragraph>
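    <Paragraph> Roughly speaking, this form arises as follows: the 'l' items contributed by &amp;quot;man&amp;quot; and &amp;quot;lives&amp;quot; are left-conjoined into the restriction man(X1)&amp;live(X1); this restriction is unified with P1 of the &amp;quot;every&amp;quot; item P2/P1-all(P1,P2), and the remaining scope P2 is then filled from the material contributed by the verb and the object noun phrase. (This is an informal preview; the actual combination is performed by the semantic component of Section 4.)</Paragraph>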
    <Paragraph position="29"> This is the same as for the above DCG. We will also show a sample parse in the next section.</Paragraph>
    <Paragraph position="30"> A fragment of an MLG illustrating the use of the shift in the treatment of possessive noun phrases is as follows: np ==&gt; det: npl.</Paragraph>
    <Paragraph position="31"> npl ==&gt; premods: noun: np2.</Paragraph>
    <Paragraph position="32"> np2 ==&gt; postmods.</Paragraph>
    <Paragraph position="33"> np2 ==&gt; poss: %npl.</Paragraph>
    <Paragraph position="34"> The idea of this fragment can be described in a rough procedural way, as follows. In parsing an np, one reads an ordinary determiner (det), then goes to npl. In npl, one reads several premodifiers (premods), say adjectives, then a head noun, then goes to np2. In np2, one may either finish by reading postmodifiers (postmods), OR one may read an apostrophe-s (poss) and then SHIFT back to npl. Illustration for the noun phrase &amp;quot;the old man's dusty hat&amp;quot;:
np:   the (det), then go to npl
npl:  old (premods), man (noun), then go to np2
np2:  's (poss), then SHIFT back to npl (%npl)
npl:  dusty (premods), hat (noun), then go to np2
np2:  postmods (nil)
When the shift is encountered, the syntactic structures (in the two-pass mode) are manipulated (in the compiled rules) so that the initial np (&amp;quot;the old man&amp;quot;) becomes a left-embedded sub-structure of the larger np (whose head is &amp;quot;hat&amp;quot;). But if no apostrophe-s is encountered, then the structure for &amp;quot;the old man&amp;quot; remains on the top level.</Paragraph>
  </Section>
  <Section position="5" start_page="106" end_page="109" type="metho">
    <SectionTitle>
3. COMPILATION OF MLG SYNTAX RULES
</SectionTitle>
    <Paragraph position="0"> In describing rule compilation, we will first look at the two-pass mode, where syntactic structures are built in the first pass, because the relationship of the analysis structures to the syntax rules is more direct in this case.</Paragraph>
    <Paragraph position="1"> The syntactic structures manipulated by the compiled rules are represented as syntactic items, which are terms of the form syn(Features,Daughters) where Features is a feature list (to be defined), and Daughters is a list consisting of syntactic items and terminals. Both types of terminal (surface and logical) are included in Daughters, but the displaying procedures for syntactic structures can optionally filter out one or the other of the two types. A feature list is of the form nt:Arg1, where nt is the principal functor of a strong non-terminal and Arg1 is its first argument. (If nt has no arguments, we take Arg1=nil.) It is convenient, in large grammars, to use this first argument Arg1 to hold a list (based on the operator ':') of grammatical features of the phrase analyzed by the non-terminal (like number and person for noun phrases).</Paragraph>
    <Paragraph position="2"> In compiling DCG rules into Prolog clauses, each non-terminal gets two extra arguments treated as a difference list representing the word string analyzed by the non-terminal. In compiling MLG rules, exactly the same thing is done to handle word strings. For handling syntactic structures, the MLG rule compiler adds additional arguments which manipulate 'syn' structures. The number of additional arguments and the way they are used depend on whether the non-terminal is strong or weak. If the original non-terminal is strong and has the form nt(X1, ..., Xn) then in the compiled version we will have  nt(X1, ..., Xn, Syn, Str1,Str2).</Paragraph>
    <Paragraph position="3"> Here there is a single syntactic structure argument, Syn, representing the syntactic structure of the phrase associated by nt with the word string given by the difference list (Str1, Str2).</Paragraph>
    <Paragraph position="4"> On the other hand, when the non-terminal nt is weak, four syntactic structure arguments are added, producing a compiled predication of the form nt(X1, ..., Xn, Syn0,Syn, Mods1,Mods2, Str1,Str2). Here the pair (Mods1, Mods2) holds a difference list for the sequence of structures analyzed by the weak non-terminal nt. These structures could be 'syn' structures or terminals, and they will be daughters (modifiers) for a 'syn' structure associated with the closest higher call to a strong non-terminal -- let us call this higher 'syn' structure the matrix 'syn' structure. The other pair (Syn0, Syn) represents the changing view of what the matrix 'syn' structure actually should be, a view that may change because a shift is encountered while satisfying nt. Syn0 represents the version before satisfying nt, and Syn represents the version after satisfying nt. If no shift is encountered while satisfying nt, then Syn will just equal Syn0. But if a shift is encountered, the old version Syn0 will become a daughter node in the new version Syn.</Paragraph>
    <Paragraph position="5"> In compiling a rule with several non-terminals in the rule body, linked by the sequencing operator ':', the argument pairs (Syn0, Syn) and (Mods1, Mods2) for weak non-terminals are linked, respectively, across adjacent non-terminals in a manner similar to the linking of the difference lists for word-string arguments. Calls to strong non-terminals associate 'syn' structure elements with the modifier lists, just as surface terminals are associated with elements of the word-string lists.</Paragraph>
    <Paragraph position="6"> Let us look now at the compilation of a set of rules. We will take the noun phrase grammar fragment illustrating the shift and shown above in Section 2, and repeated for convenience here, together with declarations of strong non-terminals.</Paragraph>
    <Paragraph position="7"> strongnonterminals(np.det.noun.poss.nil).</Paragraph>
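    <Paragraph> np ==&gt; det: npl.
npl ==&gt; premods: noun: np2.
np2 ==&gt; postmods.
np2 ==&gt; poss: %npl.</Paragraph>
    <Paragraph> In the two-pass mode, the compiled clauses have approximately the following form (a sketch: the '&amp;' conjunction and the exact placement of the unifications are assumptions; the argument pattern follows the description below):
np(Syn, Str0,Str) &lt;-
   det(Mod, Str0,Str1) &amp;
   npl(syn(np:nil,Mod:Mods), Syn, Mods,nil, Str1,Str).
npl(Syn0,Syn, Mods0,Mods, Str0,Str) &lt;-
   premods(Syn0,Syn1, Mods0,Mod:Mods1, Str0,Str1) &amp;
   noun(Mod, Str1,Str2) &amp;
   np2(Syn1,Syn, Mods1,Mods, Str2,Str).
np2(Syn0,Syn, Mods0,Mods, Str0,Str) &lt;-
   postmods(Syn0,Syn, Mods0,Mods, Str0,Str).
np2(syn(Feas,Mods0), Syn, Mod:nil,Mods, Str0,Str) &lt;-
   poss(Mod, Str0,Str1) &amp;
   npl(syn(Feas,syn(Feas,Mods0):Mods2), Syn, Mods2,Mods, Str1,Str).</Paragraph>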
    <Paragraph position="9"> In the first compiled rule, the structure Syn to be associated with the call to 'np' appears again in the second matrix structure argument of 'npl'. The first matrix structure argument of 'npl' is syn(np:nil,Mod:Mods).</Paragraph>
    <Paragraph position="10"> and this will turn out to be the value of Syn if no shifts are encountered. Here Mod is the 'syn' structure associated with the determiner 'det', and Mods is the list of modifiers determined further by 'npl'. The feature list np:nil is constructed from the leading non-terminal 'np' of this strong rule. (It would have been np:Arg1 if np had a (first) argument Arg1.) In the second and third compiled rules, the matrix structure pairs (first two arguments) and the modifier difference list pairs are linked in a straightforward way to reflect sequencing.</Paragraph>
    <Paragraph position="11"> The fourth rule shows the effect of the shift.</Paragraph>
    <Paragraph position="12"> Here syn(Feas,Mods0), the previous &amp;quot;conjecture&amp;quot; for the matrix structure, is now made simply the first modifier in the larger structure syn(Feas,syn(Feas,Mods0):Mods2) which becomes the new &amp;quot;conjecture&amp;quot; by being placed in the first argument of the further call to 'npl'. If the shift operator had been used in its binary form F0%npl, then the new conjecture would be syn(NT:F,syn(NT:F0,Mods0):Mods2) where the old conjecture was syn(NT:F,Mods0). In larger grammars, this allows one to have a completely correct feature list NT:F0 for the left-embedded modifier.</Paragraph>
    <Paragraph position="13"> To illustrate the compilation of terminal symbols, let us look at the rule det ==&gt; +D: $dt(D,P1,P2,P): P2/P1-P.</Paragraph>
    <Paragraph position="14"> from the grammar MLGRAM in Section 2. The compiled rule is det(syn(det:nil,+D:P2/P1-P:nil), D.Str,Str) &lt;- dt(D,P1,P2,P). Note that both the surface terminal +D and the logical terminal P2/P1-P are entered as modifiers of the 'det' node. The semantic interpretation component looks only at the logical terminals, but in certain applications it is useful to be able to see the surface terminals in the syntactic structures. As mentioned above, the display procedures for syntactic structures can optionally show only one type of terminal.</Paragraph>
    <Paragraph position="15">  The display of the syntactic structure of the sentence &amp;quot;Every man loves a woman&amp;quot; produced by MLGRAM is as follows.</Paragraph>
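    <Paragraph> (The tree below is an approximate rendering, with indentation standing in for the format of the display procedures; it shows the strong nodes together with both kinds of terminal.)
sent
   np
      det
         +every
         P2/P1 - all(P1,P2)
      +man
      l - man(X1)
   +loves
   l - love(X1,X2)
   np
      det
         +a
         P2/P1 - ex(P1,P2)
      +woman
      l - woman(X2)</Paragraph>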
    <Paragraph position="17"> Note that no 'vp' node is shown in the parse tree; 'vp' is a weak non-terminal. The logical form produced for this tree by the semantic component given in the next section is all(man(X1), ex(woman(X2),love(X1,X2))).</Paragraph>
    <Paragraph position="18"> Now let us look at the compilation of syntax rules for the one-pass mode. In this mode, syntactic structures are not built, but semantic structures are built up directly. The rule compiler adds extra arguments to non-terminals for manipulation of semantic structures, and adds calls to the top-level semantic interpretation procedure, 'semant'.</Paragraph>
    <Paragraph position="19"> The procedure 'semant' builds complex semantic structures out of simpler ones, where the original building blocks are the logical terminals appearing in the MLG syntax rules. In this process of construction, it would be possible to work with semantic items (and in fact a subsystem of the rules does work directly with semantic items), but it appears to be more efficient to work with slightly more elaborate structures which we call augmented semantic items. These are terms of the form sem(Feas,Op,LF), where Op and LF are such that Op-LF is an ordinary semantic item, and Feas is either a feature list or the list terminal:nil. The latter form is used for the initial augmented semantic items associated with logical terminals.</Paragraph>
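    <Paragraph> For example, a logical terminal such as l-man(X) gives rise to the initial augmented item sem(terminal:nil, l, man(X)), following the insertion rule stated at the end of this section.</Paragraph>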
    <Paragraph position="20"> As in the two-pass mode, the number of analysis structure arguments added to a non-terminal by the compiler depends on whether the non-terminal is strong or weak. If the original non-terminal is strong and has the form nt(X1, ..., Xn) then in the compiled version we will have nt(X1, ..., Xn, Sems1,Sems2, Str1,Str2).</Paragraph>
    <Paragraph position="21"> Here (Sems1, Sems2) is a difference list of augmented semantic items representing the list of semantic structures for the phrase associated by nt with the word string given by the difference list (Str1, Str2). In the syntactic (two-pass) mode, only one argument (for a 'syn') is needed here, but now we need a list of structures because of a raising phenomenon necessary for proper scoping, which we will discuss in Sections 4 and 5.</Paragraph>
    <Paragraph position="22"> When the non-terminal nt is weak, five extra arguments are added, producing a compiled predication of the form nt(X1, ..., Xn, Feas, Sems0,Sems, Sems1,Sems2, Str1,Str2).</Paragraph>
    <Paragraph position="23"> Here Feas is the feature list for the matrix strong non-terminal. The pair (Sems0, Sems) represents the changing &amp;quot;conjecture&amp;quot; for the complete list of daughter (augmented) semantic items for the matrix node, and is analogous to the first extra argument pair in the two-pass mode. The pair (Sems1, Sems2) holds a difference list for the sequence of semantic items analyzed by the weak non-terminal nt. Sems1 will be a final sublist of Sems0, and Sems2 will of course be a final sublist of Sems1.</Paragraph>
    <Paragraph position="24"> For each strong rule, a call to 'semant' is added at the end of the compiled form of the rule. The form of the call is semant(Feas, Sems, Sems1,Sems2).</Paragraph>
    <Paragraph position="25"> Here Feas is the feature list for the non-terminal on the left-hand side of the rule. Sems is the final version of the list of daughter semantic items (after all adjustments for shifts) and (Sems1, Sems2) is the difference list of semantic items resulting from the semantic interpretation for this level. (Think of Feas and Sems as input to 'semant', and (Sems1, Sems2) as output.) (Sems1, Sems2) will be the structure arguments for the non-terminal on the left-hand side of the strong rule. A call to 'semant' is also generated when a shift is encountered, as we will see below. The actual working of 'semant' is the topic of the next section.</Paragraph>
    <Paragraph position="26"> For the shift grammar fragment shown above, the compiled rules are as follows.</Paragraph>
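    <Paragraph> (The clauses below are an approximate rendering: the '&amp;' conjunction and the 'nil' list terminators are assumptions, but the variable names and argument positions follow the description in the next paragraphs.)
np(Sems, Sems0, Str0,Str) &lt;-
   det(Sems1, Sems2, Str0,Str1) &amp;
   npl(np:nil, Sems1, Sems3, Sems2, nil, Str1,Str) &amp;
   semant(np:nil, Sems3, Sems, Sems0).
npl(Feas, Sems1, Sems3, Sems4, Sems7, Str0,Str) &lt;-
   premods(Feas, Sems1, Sems2, Sems4, Sems5, Str0,Str1) &amp;
   noun(Sems5, Sems6, Str1,Str2) &amp;
   np2(Feas, Sems2, Sems3, Sems6, Sems7, Str2,Str).
np2(Feas, Sems1, Sems2, Sems3, Sems4, Str0,Str) &lt;-
   postmods(Feas, Sems1, Sems2, Sems3, Sems4, Str0,Str).
np2(Feas, Sems1, Sems4, Sems5, Sems6, Str0,Str) &lt;-
   poss(Sems5, Sems6, Str0,Str1) &amp;
   semant(Feas, Sems1, Sems2, Sems3) &amp;
   npl(Feas, Sems2, Sems4, Sems3, nil, Str1,Str).</Paragraph>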
    <Paragraph position="28"> In the first compiled rule (a strong rule), the pair (Sems, Sems0) is a difference list of the semantic items analyzing the noun phrase. (Typically there will just be one element in this list, but there can be more when modifiers of the noun phrase contain quantifiers that cause the modifiers to get promoted semantically to be sisters of the noun phrase.) This difference list is the output of the call to 'semant' compiled in at the end of the first rule. The input to this call is the list Sems3 (along with the feature list np:nil). We arrive at Sems3 as follows. The list Sems1 is started by the call to 'det'; its first element is the determiner (if there is one), and the list is continued in the list Sems2 of modifiers determined further by the call to 'npl'. In this call to 'npl', the initial list Sems1 is given in the second argument of 'npl' as the &amp;quot;initial version&amp;quot; for the final list of modifiers of the noun phrase. Sems3, being in the next argument of 'npl', is the &amp;quot;final version&amp;quot; of the np modifier list, and this is the list given as input to 'semant'. If the processing of 'npl' encounters no shifts, then Sems3 will just equal Sems1.</Paragraph>
    <Paragraph position="29"> In the second compiled rule (for 'npl'), the &amp;quot;versions&amp;quot; of the total list of modifiers are linked in a chain (Sems1, Sems2, Sems3) in the second and third arguments of the weak non-terminals. The actual modifiers produced by this rule are linked in a chain (Sems4, Sems5, Sems6, Sems7) in the fourth and fifth arguments of the weak non-terminals and the first and second arguments of the strong non-terminals. A similar situation holds for the first of the 'np2' rules.</Paragraph>
    <Paragraph position="30"> In the second 'np2' rule, a shift is encountered, so a call to 'semant' is generated. This is necessary because of the shift of levels; the modifiers produced so far represent all the modifiers in an np, and these must be combined by 'semant' to get the analysis of this np. As input to this call to 'semant', we take the list Sems1, which is the current version of the modifiers of the matrix np. The output is the difference list (Sems2, Sems3). Sems2 is given to the succeeding call to 'npl' as the new current version of the matrix modifier list. The tail Sems3 of the difference list output by 'semant' is given to 'npl' in its fourth argument to receive further modifiers. Sems4 is the final version of the matrix modifier list, determined by 'npl', and this information is also put in the third argument of 'np2'. The difference list (Sems5, Sems6) contains the single element produced by 'poss', and this list tails off the list Sems1.</Paragraph>
    <Paragraph position="31"> When a semantic item Op-LF occurs in a rule body, the rule compiler inserts the augmented semantic item sem(terminal:nil,Op,LF). As an example, the weak rule transverb(X,Y) ==&gt; +V: $tv(V,X,Y,P): l-P.</Paragraph>
    <Paragraph position="32"> compiles into the clause</Paragraph>
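    <Paragraph> (approximately, following the compilation scheme above; the exact argument order and the head unifications are assumptions)
transverb(X,Y, Feas, Sems,Sems, sem(terminal:nil,l,P):Sems2,Sems2, V.Str,Str) &lt;-</Paragraph>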
    <Paragraph position="34"> tv(V,X,Y,P).</Paragraph>
    <Paragraph position="35"> The strong rule det ==&gt; +D: $dt(D,P1,P2,P): P2/P1-P. compiles into the clause</Paragraph>
    <Paragraph position="37"/>
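    <Paragraph> (approximately, following the compilation scheme above; the '&amp;' conjunction is an assumption)
det(Sems1, Sems2, D.Str,Str) &lt;-
   dt(D,P1,P2,P) &amp;
   semant(det:nil, sem(terminal:nil,P2/P1,P):nil, Sems1,Sems2).</Paragraph>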
  </Section>
</Paper>