<?xml version="1.0" standalone="yes"?> <Paper uid="E85-1013"> <Title>THE PERPETUAL MOTION MACHINE IS THE BANE OF LIFE IN A PATENT OFFICE A MAN I JUST MET LENT ME FIVE POUNDS</Title> <Section position="2" start_page="0" end_page="0" type="metho"> <SectionTitle> SYNTACTIC APPROACHES </SectionTitle> <Paragraph position="0"> Recent discussion of the issue of how and where to attach right-hand phrases (and, more generally, clauses) in sentence analysis was started by the claims of Frazier and Fodor (1979). They offered two rules: (i) Right Association, which is that phrases on the right should be attached as low as possible on a syntax tree, thus</Paragraph> </Section> <Section position="3" start_page="0" end_page="0" type="metho"> <SectionTitle> JOHN BOUGHT THE BOOK THAT I HAD BEEN TRYING TO OBTAIN (FOR SUSAN) </SectionTitle> <Paragraph position="0"> which attaches to OBTAIN, not to BOUGHT.</Paragraph> <Paragraph position="1"> But this rule fails for</Paragraph> </Section> <Section position="4" start_page="0" end_page="0" type="metho"> <SectionTitle> JOHN BOUGHT THE BOOK (FOR SUSAN) </SectionTitle> <Paragraph position="0"> which requires attachment to BOUGHT, not BOOK.</Paragraph> <Paragraph position="1"> A second principle was then added: (ii) Minimal Attachment, which is that a phrase must be attached higher in a tree if doing so minimizes the number of nodes in the tree (and this rule is to take precedence over (i)).</Paragraph> </Section> <Section position="5" start_page="0" end_page="0" type="metho"> <SectionTitle> JOHN CARRIED THE GROCERIES (FOR MARY) </SectionTitle> <Paragraph position="0"> Attaching FOR MARY to the top of the tree, rather than to the NP, creates a tree with one node fewer. Shieber (1983) has an alternative analysis of this phenomenon, based on a clear parsing model, which produces the same effect as rule (ii) by preferring longer reductions in the parsing table; i.e., in the present case, preferring VP <- V NP PP to NP <- NP PP.</Paragraph> <Paragraph position="1"> But there are still problems with (i) and (ii) taken together, as is seen in</Paragraph> </Section> <Section position="6" start_page="0" end_page="0" type="metho"> <SectionTitle> SHE WANTED THE DRESS (ON THAT RACK) </SectionTitle> <Paragraph position="0"> where (ON THAT RACK) attaches to THE DRESS, rather than to WANTED as (ii) would cause.</Paragraph> </Section> <Section position="7" start_page="0" end_page="89" type="metho"> <SectionTitle> SEMANTIC APPROACHES (i) Lexical Preference </SectionTitle> <Paragraph position="0"> At this point Ford et al. (1981) suggested the use of lexical preference, which is conventional case information associated with individual verbs, used to select for attachment those PPs which match that case information. This is semantic information in the broad sense in which that term has traditionally been used in AI. Lexical preference allows rules (i) and (ii) above to be overridden if a verb's coding expresses a strong preference for a certain structure. The effect of that rule differs from system to system: within Shieber's parsing model (1983) it means in effect that a verb like WANT will prefer to have only a single NP to its right. The parser then performs the longest reduction it can with the strongest leftmost stack element. So, if POSITION, say, prefers two entities to its right, a trailing PP will be attached to the verb rather than to its object NP. But this iterative patching with more rules does not work, because for every rule (i, ii and lexical preference) there are clear and simple counter-examples.
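As a minimal sketch of how these three rules might be stacked as defaults, consider the toy function below. It is not a rendering of any of the cited parsers; the lexical table, the node counts and the choose_attachment interface are our own illustrative assumptions.

    # A minimal sketch, not any of the cited parsers: stack lexical preference over
    # Minimal Attachment over Right Association when choosing a PP attachment site.
    LEXICAL_PREFERENCE = {
        "want": "object-NP",   # WANT prefers a single NP to its right, so the PP stays with that NP
        "position": "VP",      # POSITION prefers an NP and a PP of its own
    }

    def choose_attachment(verb, sites):
        """sites: candidate attachment sites ordered low (rightmost) to high,
        each a (label, extra_nodes) pair, where extra_nodes is the number of
        tree nodes the attachment would add."""
        preferred = LEXICAL_PREFERENCE.get(verb)
        if preferred is not None and any(label == preferred for label, _ in sites):
            return preferred                           # lexical preference overrides (i) and (ii)
        fewest = min(extra for _, extra in sites)
        minimal = [label for label, extra in sites if extra == fewest]
        if len(minimal) == 1:
            return minimal[0]                          # (ii) Minimal Attachment
        return sites[0][0]                             # (i) Right Association: attach as low as possible

    # JOHN CARRIED THE GROCERIES (FOR MARY): VP attachment adds one node fewer.
    print(choose_attachment("carry", [("object-NP", 2), ("VP", 1)]))   # VP
    # SHE WANTED THE DRESS (ON THAT RACK): lexical preference keeps the PP with the NP.
    print(choose_attachment("want", [("object-NP", 2), ("VP", 1)]))    # object-NP

These stacked defaults are exactly what the counter-examples below defeat.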
Thus, there is :</Paragraph> </Section> <Section position="8" start_page="89" end_page="89" type="metho"> <SectionTitle> JOE TOOK THE BOOK THAT I BOUGHT (FOR SUSAN) </SectionTitle> <Paragraph position="0"> which comes under (i), and there is</Paragraph> </Section> <Section position="9" start_page="89" end_page="89" type="metho"> <SectionTitle> JOE BROUGHT THE BOOK THAT I LOVED (FOR SUSAN) </SectionTitle> <Paragraph position="0"> which Shieber's parser must get wrong, and not in a way that (ii) could rescue. Under (ii) itself, there is</Paragraph> </Section> <Section position="10" start_page="89" end_page="89" type="metho"> <SectionTitle> JOE LOST THE TICKET (TO PARIS) </SectionTitle> <Paragraph position="0"> which Shieber's conflict reduction rule must get wrong. For Shieber's version of lexical preference there will be problems with : ... DAUGHTER), which the rules he gives for WANT must get wrong.</Paragraph> <Paragraph position="1"> (ii) Schubert Schubert (1984) presents some of the above counter-examples in an attack on syntactically based methods. He proposes a syntactico-semantic network system of what he calls preference trade-offs. He is driven to this, he says, because he rejects any system based wholly on lexically-based semantic preferences (which is part of what we shall here call preference semantics, see below, and which would subsume the simpler versions of lexical preference). He does this on the grounds that there are clear cases where "syntactic preferences prevail over much more coherent alternatives" (Schubert, 1984, p.248), where by "coherent" he means interpretations imposed by semantics/pragmatics. His examples are given diagrammatically (with full lines showing the "natural" pragmatic interpretations, and dotted lines the interpretations that Schubert says are imposed willy-nilly by the syntax). Our informants disagree with Schubert: they attach as the syntax suggests, to LIVE, but still insist that the leave is Mary's (i.e. they so interpret the last clause that it contains an elided (WHILE) SHE WAS (ON ...)). If that is so, the example does not split off semantics from syntax in the way Schubert wants, because the issue is who is on leave and not when something was done. In such circumstances the example presents no special problems.</Paragraph> </Section> <Section position="11" start_page="89" end_page="89" type="metho"> <SectionTitle> JOHN MET THE ... HAIRED GIRL FROM MONTREAL THAT HE MARRIED (AT A DANCE) </SectionTitle> <Paragraph position="0"> Here our informants attach the phrase resolutely to MET, as commonsense dictates (i.e. they ignore, or are able to discount, the built-in distance effect of the very long NP). A more difficult and interesting case arises if the last phrase is (AT A WEDDING), since the example then seems to fall within the exclusion of an "attachment unless it yields zero information" rule deployed within preference semantics (Wilks, 1973), which is probably, in its turn, a close relative of Grice's (1975) maxim concerned with information quantity. In the (AT A WEDDING) case, informants continue to attach to MET, seemingly discounting both the syntactic indication and the information vacuity of MARRIED AT A WEDDING.</Paragraph>
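The zero-information exclusion can be pictured as a small filter over candidate attachment sites. The sketch below is our own gloss, not Wilks's (1973) formulation; the ENTAILED_MODIFIERS table is an invented stand-in for whatever knowledge lets a system see that MARRIED AT A WEDDING adds nothing to MARRIED.

    # A toy gloss of the "attach unless it yields zero information" exclusion.
    ENTAILED_MODIFIERS = {
        "married": {"at a wedding"},
        "named": {"after someone"},
    }

    def adds_information(head, modifier):
        """False when the modifier only restates what the head already entails."""
        return modifier not in ENTAILED_MODIFIERS.get(head, set())

    def attach_informatively(candidate_heads, modifier):
        """Scan candidate heads, nearest first, skipping vacuous attachments."""
        for head in candidate_heads:
            if adds_information(head, modifier):
                return head
        return candidate_heads[0]

    # ... THAT HE MARRIED (AT A WEDDING): the rule excludes MARRIED and falls back to MET.
    print(attach_informatively(["married", "met"], "at a wedding"))   # met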
<Paragraph position="1"> JOHN WAS NAMED (AFTER HIS TWIN SISTER) Here our informants saw genuine ambiguity and did not seem to mind much whether attachment or lexicalization of NAMED AFTER was preferred. Again, information vacuity tells against the syntactic attachment (the example is on the model of HE WAS NAMED AFTER HIS FATHER, Wilks 1973, which was used to make a closely related point), but normal gendering of names tells against the lexicalization of the verb to NAME+AFTER.</Paragraph> <Paragraph position="2"> Our conclusion from Schubert's examples is the reverse of his own: these are not simple examples but very complex ones, involving distance and (in two cases) information quantity phenomena. In none of the cases do they support the straightforward primacy of syntax that his case against a generalized "lexical preference hypothesis" (i.e. one without rules (i) and (ii) as default cases, as in Ford et al.'s lexical preference) would require. We shall therefore take that hypothesis, under the name preference semantics, to be still open.</Paragraph> <Paragraph position="3"> (iii) Hirst Hirst (1984) aims to produce a conflation of the approaches of Ford et al., described above, and a principle of Crain and Steedman (1984) called the Principle of Parsimony, which is to make the attachment that leaves the minimum number of presuppositions unsatisfied. The example usually given is that of a "garden path" sentence like THE HORSE RACED PAST THE BARN FELL, where the natural (initial) preference for the garden-path interpretation is to be explained by the fact that, on that interpretation, only the existence of an entity corresponding to THE HORSE is presupposed, and that means fewer presuppositions to which nothing in the memory structure corresponds than are needed to opt for the existence of some HORSE RACED PAST THE BARN. One difficulty here is what it is for something to exist in memory: Crain and Steedman themselves note that readers do not garden-path with sentences like</Paragraph> </Section> <Section position="12" start_page="89" end_page="90" type="metho"> <SectionTitle> CARS RACED AT MONTE CARLO FETCH HIGH PRICES AS COLLECTOR'S ITEMS </SectionTitle> <Paragraph position="0"> but that is not because readers know of any particular cars raced at Monte Carlo. Hirst accepts from Winograd (1972) a general Principle of Referential Success (i.e. reference to actual existent entities), but the general unsatisfactoriness of restricting a system to actual entities has long been known, for so much of our discourse is about possible and virtual ontologies (for a full discussion of this aspect of Winograd, see Ritchie 1978).</Paragraph> <Paragraph position="1"> The strength of Hirst's approach is his attempt to reduce the presuppositional metric of Crain and Steedman to criteria manipulable by basic semantic/lexical codings, and particularly the contrast of definite and indefinite articles.
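As a gloss on how that metric might be operationalized with such codings, the following toy sketch counts unsatisfied presuppositions using the definite/indefinite contrast. It is our own construction, with hand-built readings and an invented discourse model standing in for "memory", not Hirst's implementation.

    # Definite NPs are charged when no referent is already in the discourse model;
    # indefinite NPs are charged nothing (only plausibility is presupposed).
    def unsatisfied(reading, discourse_entities):
        return sum(1 for np, definite in reading
                   if definite and np not in discourse_entities)

    def preferred_reading(readings, discourse_entities):
        """Pick the reading that leaves the fewest presuppositions unsatisfied."""
        return min(readings, key=lambda name: unsatisfied(readings[name], discourse_entities))

    # THE HORSE RACED PAST THE BARN (FELL): the reduced-relative reading
    # presupposes an extra entity, the horse that was raced past the barn.
    readings = {
        "main-verb (garden path)": [("the horse", True)],
        "reduced relative": [("the horse", True), ("the horse raced past the barn", True)],
    }
    print(preferred_reading(readings, discourse_entities=set()))   # main-verb (garden path)

Whether such a count can be computed reliably from articles alone is exactly what is at issue next.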
But the general determination of categories like definite and indefinite is so shaky (and only indirectly related to "the" and "a" in English) that it cannot possibly bear the weight he puts on it as the solid basis of a theory of phrase attachment.</Paragraph> <Paragraph position="2"> So Hirst's Principle of Referential Success (1984, p.149), adapted from Winograd, invites counter-examples: "a non-generic NP presupposes that the thing it describes exists ... an indefinite NP presupposes only the plausibility of what it describes." But this is just not so in either case:</Paragraph> </Section> <Section position="13" start_page="90" end_page="90" type="metho"> <SectionTitle> THE PERPETUAL MOTION MACHINE IS THE BANE OF LIFE IN A PATENT OFFICE </SectionTitle> <Paragraph position="0"> A MAN I JUST MET LENT ME FIVE POUNDS The machine is perfectly definite, but the perpetual motion machine does not exist and is not presupposed by the speaker; conversely, the man is indefinite, yet his existence is fully presupposed. We conclude that these notions are not yet in a state to be the basis of a theory of PP attachment. Moreover, even though beliefs about the world must play a role in attachment in certain cases, there is, as yet, no reason to believe that beliefs and presuppositions can provide the material for a basic attachment mechanism.</Paragraph> <Paragraph position="1"> (iv) Preference Semantics Preference Semantics has claimed that appropriate structurings can be obtained using essentially semantic information, given also a rule of preferring the most densely connected representations that can be constructed from such semantic information (Wilks 1975, Fass & Wilks 1983).</Paragraph> <Paragraph position="2"> Let us consider such a position initially expressed as semantic dictionary information attaching to the verb; this is essentially the position of the systems discussed above, as well as of case grammar and the semantics-based parsing systems (e.g. Riesbeck 1975) that have been based on it. When discussing implementation in the last section we shall argue (as in Wilks 1976) that semantic material that is to be the base of a parsing process cannot be thought of as simply attaching to a verb (rather than to nouns and all other word senses). In what follows we shall assume case predicates in the dictionary entries of verbs, nouns, etc. that express part of the meaning of the concept and determine its semantic relations. We shall write as [OBTAIN] the abbreviation of the semantic dictionary entry for OBTAIN, and assume that the concepts discussed contain at least the case entries shown, written as a case predicate together with the preferred type of its argument; e.g. (recipient hum) expresses recipient case with a human filler.</Paragraph> <Paragraph position="3"> The issue here is whether these are plausible preferential meaning constituents: e.g. that to obtain something is to obtain it for a recipient; to position something is to do it in association with a place; a ticket (in this sense, i.e. "billet" rather than "ticket" in French) is a ticket to somewhere, and so on. They do not entail restrictions, but only preferences. Hence, "John brought his dog a bone" in no way violates the coding [BRING].
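As an illustration of such codings, here is a toy dictionary in the spirit of the case-predicate entries just described. The particular predicates, type names and entries are assumptions of ours for the sketch, not the paper's actual lexicon.

    # Illustrative semantic dictionary entries: case preferences, not restrictions.
    LEXICON = {
        "obtain": {"recipient": "human"},    # to obtain something is to obtain it for a recipient
        "position": {"location": "place"},   # to position something is to do it at a place
        "ticket": {"direction": "place"},    # a ticket (billet) is a ticket to somewhere
        "bring": {"recipient": "human"},
    }

    def prefers(head, case, filler_type):
        """True when the head's entry lists this case with a matching filler type.
        A failed match is not a violation: a preference may simply go unsatisfied."""
        return LEXICON.get(head, {}).get(case) == filler_type

    print(prefers("obtain", "recipient", "human"))   # True: ... OBTAIN (FOR SUSAN)
    print(prefers("bring", "recipient", "dog"))      # False, yet "John brought his dog a bone" is fine

The design point is that unmatched preferences carry no penalty; they simply fail to attract material.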
We shall refer to these case constituents within semantic representations as semantic preferences of the corresponding head concept.</Paragraph> <Paragraph position="4"> A FIRST TRIAL ATTACHMENT RULE. The examples discussed are correctly attached by the following rule. Rule A: moving leftwards from the right-hand end of a sentence, assign the attachment of an entity X (word or phrase) to the first entity to the left of X that has a preference that X satisfies; this entails that any entity X can satisfy the preference of only one entity. Assume also a push-down stack into which such entities X are inserted until they satisfy some preference. Assume also some distance limit (to be empirically determined) and a DEFAULT rule such that, if an X satisfies no preferences, it is attached locally, i.e. immediately to its left.</Paragraph> <Paragraph position="5"> Rule A gets right all the classes of examples discussed (with one exception, see below), the last of which requires use of the push-down stack. The phenomenon treated here is assumed to be much more general than just phrases, as in PÂTÉ DE CANARD TRUFFÉ (i.e. a truffled pâté of duck, not a pâté of truffled ducks!), where we envisage a preference (POSS STUFF), i.e. a preference to be predicated of substances, as part of [TRUFFÉ]. French gender is of no use here, since all the concepts are masculine.</Paragraph> <Paragraph position="6"> This rule would of course have to be modified for many special factors, e.g. pronouns, because of :</Paragraph> </Section> <Section position="14" start_page="90" end_page="90" type="metho"> <SectionTitle> THE DRESS SHE WANTED (ON THE SHELF) </SectionTitle> <Paragraph position="0"> A more substantial drawback to this substitution of a single semantics-based rule for all the earlier syntactic complexity is that, by placing the preferences essentially in the verbs (as did the systems discussed earlier that used lexical preference), having little more than semantic type information on nouns (except in cases like [TICKET], which also prefers associated cases) and, most importantly, having no semantic preferences associated with the prepositions that introduce phrases, we shall only succeed with Rule A, for a large and simple class of cases, by means of a semantic subterfuge, namely:</Paragraph> </Section> <Section position="15" start_page="90" end_page="91" type="metho"> <SectionTitle> JOHN LOVED HER (FOR HER BEAUTY) or JOHN SHOT THE GIRL (IN THE PARK) </SectionTitle> <Paragraph position="0"> Given the "low default" component of Rule A, these can only be correctly attached if there is a very general case component in the verbs, e.g. some statement of location in all "active types" of verbs (described by the primitive type heads in their codings) like SHOOT, i.e. (location *pla), which expresses the fact that acts of this type are necessarily located. (location *pla) is then the preference that (IN THE PARK) satisfies, thus preventing a low default.</Paragraph> <Paragraph position="1"> Again, verbs like LOVE would need a (REASON ANY) component in their coding, expressing the notion that such states (as opposed to actions, both defined in terms of the main semantic primitives of verbs) are dependent on some reason, which could be anything.
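A minimal sketch of Rule A under simplifying assumptions: it handles one right-hand phrase at a time, omits the push-down stack, and uses invented preference entries; the distance limit is an arbitrary placeholder for the empirically determined one.

    # Rule A, radically simplified: move leftwards from the right-hand phrase and
    # attach it to the first head whose (unused) preference it satisfies, within a
    # distance limit; otherwise attach locally.
    DISTANCE_LIMIT = 6   # "to be empirically determined"; an arbitrary placeholder

    def attach(phrase_case, left_heads, preferences):
        """left_heads: heads to the left of the phrase, in left-to-right order.
        preferences: head -> set of case preferences, each satisfiable only once."""
        for distance, head in enumerate(reversed(left_heads)):
            if distance >= DISTANCE_LIMIT:
                break
            wants = preferences.get(head, set())
            if phrase_case in wants:
                wants.discard(phrase_case)   # a preference, once satisfied, is used up
                return head
        return left_heads[-1]                # DEFAULT: attach locally, immediately to the left

    # JOHN BOUGHT THE BOOK THAT I HAD BEEN TRYING TO OBTAIN (FOR SUSAN)
    print(attach("recipient", ["bought", "book", "obtain"], {"obtain": {"recipient"}}))   # obtain
    # JOHN SHOT THE GIRL (IN THE PARK): SHOOT carries the general (location *pla) preference
    print(attach("location", ["shot", "girl"], {"shot": {"location"}}))                   # shot
    # JOHN BOUGHT THE BOOK (FOR SUSAN)
    print(attach("recipient", ["bought", "book"], {"bought": {"recipient"}}))             # bought

The consumable preferences and the local default are what carry the work here; everything else is lexicon.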
But the clearest defect of Rule A (and, by implication, of all the verb-centered approaches discussed earlier in the paper) is that verbs in fact confront not cases but PPs fronted by ambiguous prepositions, and it is only by taking account of their preferences that a general solution can be found.</Paragraph> </Section> </Paper>