<?xml version="1.0" standalone="yes"?>
<Paper uid="C65-1019">
  <Title>PUSHDOWN STORES AND SUBSCRIPTS</Title>
  <Section position="3" start_page="19" end_page="19" type="concl">
    <SectionTitle>
1.2. DC and PDS
</SectionTitle>
    <Paragraph position="0"> The crucial step in the derivation of DC by the automaton (for a full description, see Yngve 1960:448-9) is the question asked: Does the right half of the grammar rule in question (GRi) contain the symbol "..."? (where "..." stands for "discontinuity in rewriting the symbol on the left hand side of the rule"). If the answer is Yes, we have to roll out the temporary memory (TM) tape one space (in a flow chart, one would symbolize this by the index notation l - 1 --&gt; l, where l stands for "leftmost"; "rolling in" tape would then be indicated by l + 1 --&gt; l, see Fig. 1). During this operation, the original content of TM1 (the leftmost location of TM) has to be kept in place, that is, the blank has to occur after the original TM1 (on the right side, if the tape is thought of as moving from the left, see Fig. 1). If, however, the answer is No, we have to make sure that we have space for all the symbols on the right hand side of the rule and roll out tape accordingly. Let n be the number of symbols on the right hand side of GRi: then we can symbolize the rolling out by the index formula l - (n - 1) --&gt; l; the first symbol always goes to the computing register.</Paragraph>
    <Paragraph position="1"> Let further j be the subscript for right hand side symbols of GRi. The rest of the operation is then performed as routine counting on GRij, j being set at 2 (the first symbol has already been taken care of). There should, of course, be a proviso for the symbol "..." itself, so that it will not be copied onto the TM tape.</Paragraph>
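The tape-handling routine described in the two paragraphs above can be sketched as code. This is a minimal sketch under stated assumptions, not Yngve's program: the function name `expand`, the list representation of the TM tape, and the treatment of the computing register are all illustrative.

```python
DISCONT = "..."  # the discontinuity symbol in a rule's right half

def expand(rule_rhs, tm):
    """Expand one grammar rule, returning (register, tm).

    tm is the temporary-memory tape, leftmost location first (tm[0] = TM1).
    """
    if DISCONT in rule_rhs:
        # Discontinuous case: roll out one space, keeping the original TM1
        # in place -- the blank opens up *after* TM1, at index 1.
        rest = [s for s in rule_rhs if s != DISCONT]
        register, deferred = rest[0], rest[1:]
        tm = tm[:1] + deferred + tm[1:]
    else:
        # Continuous case: roll out n - 1 spaces; the first symbol goes to
        # the register, the remaining n - 1 symbols onto the left of TM.
        register, deferred = rule_rhs[0], list(rule_rhs[1:])
        tm = deferred + tm
    return register, tm
```

For A --&gt; B + ... + C with TM holding T1, T2, the sketch yields B in the register and the tape T1, C, T2: C is deferred until after the material already waiting in TM1, which is the whole point of the discontinuity symbol.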
    <Paragraph position="2"> The method as described here will work neatly even in those cases where DC are "nested", that is, if the expansion of some DC turns out to be another DC (and so on, at least theoretically). As an example, one may try out the doubly discontinuous as far as the corner, where all the necessary rules are specified by Yngve himself (1960:449a).</Paragraph>
    <Paragraph position="3"> An implicit assumption throughout the description of the mechanism is that DC can be represented by the simple formula A --&gt; B + ... + C.</Paragraph>
    <Paragraph position="4"> It follows that there are two cases that cannot be handled directly by the machine: the first one can be symbolized by A --&gt; B + ... + C + ... + D ("multiple discontinuous constituents"); this reduces easily to double discontinuity by a suitable manipulation of the input rules.</Paragraph>
    <Paragraph position="5"> The other case could be labeled "discontinuous multiple constituents": formula A --&gt; B + C + ... + D (or some variation on this theme), which would imply that the blank has to occur two spaces from leftmost instead of one. Following the instructions given by Yngve we would not obtain the right string of symbols in this case (as examples, one may try: He's not that big a fool, or: As nice a little parlor as ever you did see, or the Spanish sentence: Habla mas de lo que sabe 'He talks more than what he knows' (Bolinger 1957:63), where common sense would prefer the analyses that big ... fool, as nice ... parlor, mas de ... que (see diagrams in Fig. 2), thus preserving analogy with constructions like such a fool etc.). The program could be accommodated to perform this by combining a counting operation with the check on "...", whereafter the continuous part of GRi's right hand side could be thrown in with the non-DC rules. Derivation being different, there would be no interference from constructions like that big fool, that are treated in the normal way by the machine.</Paragraph>
    <Paragraph position="6"> A device like the one described here will, within its obvious limitations, be able to randomly generate sentences that are for the most part quite grammatical (Yngve 1962:70).</Paragraph>
    <Section position="1" start_page="19" end_page="19" type="sub_section">
      <SectionTitle>
</SectionTitle>
      <Paragraph position="0"> The question is: will it generate all, and only, the grammatical sentences of a language? I will try to answer this question in the next paragraph.</Paragraph>
      <Paragraph position="1"> 1.3. Limitations of PSG/PDS Although the model as proposed by Yngve in its original form only uses the PDS technique to solve a minor problem in syntactic analysis by the machine, the scope and use of PDS are by no means limited to this particular problem of DC (for a detailed discussion, see Oettinger 1961:126-7). The elegance and simplicity of PDS algorithms make them well-suited for procedures of automatic syntactic analysis of languages.</Paragraph>
      <Paragraph position="2"> There are, however, some inherent limitations. Common to all PDS techniques is the fact that information stored in this way is only accessible in accordance with the formula "last in, first out". The memory being essentially a linear array of information (Oettinger 1961:104), the user (the machine) will not be able to draw on other information than is given by the leftmost symbol in a left-to-right production (the temporary memory tape in Yngve's machine, see Fig. 1). Since, on the one hand, the machine output is past control (what is printed, is no longer available to the machine for inspection) and, on the other hand, the internal state of the machine is entirely determined by the current input symbol, one has to keep careful account not only of the current derivational steps, but also of the "left-overs" from earlier steps. This is exactly what a PDS can do, and the problems in connection with this technique are, as shown above in the case of the so-called discontinuous multiple constituents, mainly technical (providing indexes etc.). The linear character of the memory, however, together with the finite state properties of the model itself give rise to another problem that seems unsolvable under the following assumptions for our machine: a finite number of states, a linear temporary memory, and a transition from one state into another by one-symbol input. The problem is the following: given any internal state of the machine that is determined by more than one symbol simultaneously, will the supplementary device of a PDS be able to supply the necessary instructions to the machine that are not contained in the current symbol? 
The answer is in the negative, precisely because the memory is linear, and there is no "look-up" for items in the memory. What is stored in the memory can only be brought up to the surface by something outside the memory itself, that is, I have to create an "expectancy" that is specific for each item in the PDS. Only under these conditions can the state of the machine be defined as determined by the current symbol plus the contents of the temporary memory (Yngve 1960:449). This is essentially the procedure described by Harris for keeping track of nested constructions ("incurrence and discharge of requirements", Harris 1962:53). The reason why the machine is able to handle DC is that this "nesting" occurs in one level, so that the symbols involved can be uniquely determined as belonging to the same dimension of analysis.</Paragraph>
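The last-in-first-out access just described can be made concrete with a short sketch; the class and method names are illustrative. The point is simply that only the topmost item is ever inspectable, so anything stored deeper must be "expected" into view by first discharging what lies above it.

```python
class PushdownStore:
    """A linear memory with access at one end only: last in, first out."""

    def __init__(self):
        self._items = []

    def push(self, symbol):
        self._items.append(symbol)

    def pop(self):
        # Discharge the most recently stored item.
        return self._items.pop()

    def top(self):
        # The *only* inspectable item; there is no look-up below it.
        return self._items[-1] if self._items else None
```

Pushing NP, then VP, then N leaves only N visible; NP, stored first, becomes accessible last, exactly the incurrence-and-discharge behavior needed for one-level nesting and nothing more.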
      <Paragraph position="3"> Where "surface structure" is explained only by underlying "deep structure" (Hockett 1959:246 ff.), the machine will not be able to carry out the analysis correctly. The structure that underlies a symbol X1 may be bound up with a special PS derivation, so that rules concerning structures like, say, X1 + X2 + X3 will be ambiguous in their application. One could place restrictions (in Harris' sense) on (one of) the symbols, thus creating a multiple path through the derivation, possibly combined with a cycling device: this is what the subscript technique does, see 2.4 for a detailed discussion. Some of the difficulties are removed in this way, but others persist, like those cases where pairs of symbol formulae are involved (the so-called "generalized transformations" of early TG, Chomsky 1957:113); this point is also discussed below. While placing too many restrictions on the symbols has serious disadvantages (some of which will be discussed in section 2 of this paper), it certainly exceeds the capacity of the model as described by Yngve: his rules are all of the context-free form.</Paragraph>
      <Paragraph position="4"> Thus, structure in a sufficiently powerful PSG is not only a matter of specifying the right rules, but also of choosing the right rules and combining them at the right places.</Paragraph>
      <Paragraph position="5"> There is still another factor that we have left out of consideration so far: the ordering of the rules. Yngve states that any order will do: an alphabetical order may be convenient (1960:445). Now this has two consequences: first, all of the rules have to be run through every time a symbol is expanded (perhaps only a minor drawback in a computer-oriented analysis); second, the advantages of ordered rules (economy, elegance, accuracy) are lost ("forcing all kinds of low-level detail into the rules", Bach 1964:53). Besides, ordering of the rules is indispensable in cases where complicated high-level structural descriptions are involved: thus an immediate derivation of each non-terminal symbol all the way down to word level would not be permitted in any kind of PSG, not even the most context-sensitive ones. Being essentially context-free, Yngve's grammar will generate what is usually called "kernel sentences" (Chomsky 1963:152): unambiguous derivation of more complex structures (derived sentences) will only be feasible under a careful specification of the order in which the rules have to apply (as an example, cf. the discussion of wh-transformations as depending on the interrogative transformation in Chomsky 1963:140).</Paragraph>
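The first consequence noted above, that unordered rules force a scan of the entire rule set at every expansion, can be sketched as follows; the rule representation and function name are assumptions for illustration, not Yngve's notation.

```python
import random

def expand_symbol(symbol, rules):
    """Expand one symbol under an *unordered* rule set.

    Because the rules carry no order, every expansion must inspect the
    whole list to collect the rules whose left side matches.
    """
    candidates = [rhs for lhs, rhs in rules if lhs == symbol]  # full scan
    if not candidates:
        return None  # no rule applies: treat the symbol as terminal
    return random.choice(candidates)  # random generation, as in Yngve
```

With only one matching rule the choice is deterministic; with several, the device generates at random, which is all an unordered rule set can offer without the extra bookkeeping the paragraph goes on to discuss.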
      <Paragraph position="6"> There is another way out of the difficulties that have been sketched in this section: phrase-structurizing at different levels, these being kept together by the representation relation (see Sgall 1964b). This solution is based on a somewhat different interpretation of PSG functions (not only syntactic, but also semantic rules are included); a PDS is coupled with the PSG of the lowest level. A detailed discussion of this system will have to wait for more details, but it seems that grammars based on dependency relations have received too little attention so far (for a comparison of IC and dependency theories, see Hays 1964:519-22). 1.4. Grammar and psychology Referring to experiments performed by G. A. Miller, Yngve establishes an analogy between the "depth" of memory in the human brain and the depth of sentence construction in the model (1960:452). The human brain is not capable of storing more than, say, seven plus or minus two items at a time (for references, see Yngve ibid.). In other words, the human brain has a limited capacity, just like the temporary memory of Yngve's machine. One of the conditions to be put on a flawless handling of "deep" constructions is that the storage capacity is not exceeded by the number of symbols to be developed later on. In this connection Yngve makes the interesting observation that sentences and constructions in general actually do have a sort of limited depth, i.e. the number of regressive nodes is bound by more or less the same upper limit as that for human memory's simultaneous storage capacity.</Paragraph>
      <Paragraph position="7"> Now, I think that the analogy between the two kinds of "storage" should not be overstressed. It rests primarily on the tacit assumption that the model should, or could, be considered as a more or less true-to-life representation of human linguistic activity. As I have remarked before, this supposition is altogether groundless, and will at best hamper an explanation of such activity in truly linguistic terms. A remark made by Yngve in this connection may clarify the issue. Yngve says (1960:452b; see also 1961:135-6 for an even more explicit commitment): "The depth limitation does not apply to algebra, for example, because it is not a spoken language. The user has paper available for temporary storage." But so has the user of any other language, e.g. human everyday spoken language. The fact that we do not actually use paper when speaking has nothing to do with greater or lesser depth of sentences (or, if it does, the depth differences occur only to one side, namely that of decreasing depth). One could pursue this analogy ad absurdum by assuming two kinds of depth, one unlimited, for written languages, and one limited, for spoken languages. The results would be disastrous for any description of any language: sentences of the type: "That that that they are both isosceles is true is obvious isn't clear" (Yngve 1960:458b) are as ungrammatical in written as they are in spoken English. Of course Yngve is perfectly right in attributing the difference between the above non-grammatical (deeply regressive) that-clause and its grammatical (progressive) counterpart: "It isn't clear that it is obvious that it is true that they are both isosceles" to excess depth. So, there is a depth limitation and this limitation is grammatically relevant.</Paragraph>
      <Paragraph position="8"> But this linguistically fruitful concept should not be confounded with hypotheses from descriptive psychology.</Paragraph>
      <Paragraph position="9"> That the claim for descriptive similarity between psychology and linguistics is latent in Yngve's model can be seen from another instance. His second assumption for the model (1960:445) is that "the model should share with the human speaker ... the property that words are produced one at a time in the proper time sequence, that is, in left-to-right order ..." (the first assumption, viz. that any shortcomings of the PS model can be overcome, has partly been dealt with above, and will be treated at length in the second half of this paper).</Paragraph>
      <Paragraph position="10"> This restriction, I think, on a model (or a grammar, insofar as the grammar is based on the model) is unnecessary and self-contradictory. It is unnecessary, since the model should only copy relevant traits in the speech production of the individual; and even though it may be true that words are produced in a linear sequence (as already Saussure has remarked), it has not yet been shown how this linearity is to be interpreted in human speech production: I think it is only weakly relevant, that is to say, linearity alone will never suffice to give a complete picture of the speech event.</Paragraph>
      <Paragraph position="11"> For a full-fledged description of speech I suppose the assumption that we speak in sentences rather than in words will have many advantages.</Paragraph>
      <Paragraph position="12"> Moreover, the claim that the model should duplicate the property of left-to-right production in the human speaker cannot be brought to harmonize with the model. In fact, the model can only examine one symbol at a time: the machine may erase or delete or read only that section of the memory tape that is closest to the roll, i.e. the leftmost symbol only (Yngve 1960:446). Now, the limitation of human memory is on reproducing more than a certain number of items at the same time. The analogy clearly does not hold between human memory and machine storage: the explanation is that the machine produces symbols, whereas the speech of humans is structured. In other words, a left-to-right production may in many cases be explained by a linear structure in the producer; the pushdown store is a linear memory device. But there are other left-to-right productions that are structured in such a way that a PDS or other left-to-right arrangements will not suffice. It is of course true that a structural description is not altogether absent from a PSG/PDS: Yngve's machine produces as its output a string of symbols containing both syntactical markers ("flattened-out trees") and terminal symbols. This will suffice to "infer the derivational history of each string from that string in a single way" (Sgall 1963:41), but only insofar as the structure can be described in one-level terms, cf.</Paragraph>
      <Paragraph position="13"> discussion above (see also Sgall 1963; 1964a).</Paragraph>
      <Paragraph position="14"> The question will be treated at length in part two of this paper.</Paragraph>
      <Paragraph position="15">  The subscript method referred to here is not in the first place thought of as a machine program (even though its close affinity with the computer language COMIT is asserted, see Harman 1963:608fn.). Accordingly, it has a more general scope: namely, to offer a full-fledged alternative, in PS form, to other grammars (e.g. of transformational obedience), thereby proving that "transformational grammar has no advantage over the phrase structure grammar" (Harman 1963:598).</Paragraph>
      <Paragraph position="16"> Subscripts are added to the PSG rules in two ways: first, to introduce restrictions on such rules; second, to specify where those restrictions apply. An example of the first kind is the rule S --&gt; S1/NUMBER_SG (Harman 1963:609), and, in general, any rule of the type A --&gt; B/J + ... . The second case obtains e.g. in the following rule: NP/NOT_WH --&gt; DETERMINER + NOUN, and, of course, in all rules where subscripts are "lost" during expansion. I think there will be a third type as well, even though this is not expressly mentioned in the article, namely, subscripts that do both: introduce new subscripts at places indicated by old ones; but this is only a minor point. More important is the observation that subscripts can take care of all sorts of constituents, both continuous and discontinuous. For the latter, the generation rules are adapted from rules suggested by Victor Yngve (Harman 1963:606; the reference quoted is Yngve 1960). As in Yngve's model, the rules of PSG/S are unordered: all necessary information about when and where to apply a rule is contained in the subscripts (which, by the way and perhaps a fortiori, are said to occur in an unordered sequence). But, as will be seen from the following paragraphs, this "when" and "where" is not only a notational problem: in fact, it is one of the big underlying differences between PSG and TG. (On the difficulty of ordering rules in a PSG, see Chomsky 1957:35). A further important difference from other PSG interpretations is the admission of deletion rules, that is rules of the form A --&gt; ∅ (Harman 1963:605); also this point will be discussed at length below.</Paragraph>
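The two uses of subscripts just distinguished, restricting a rule and marking where restrictions apply, can be sketched in code. The slash notation follows Harman's examples quoted above; the helper functions themselves are illustrative assumptions, not Harman's program.

```python
def parse(symbol):
    """Split 'NP/NOT_WH,NUMBER_SG' into a base and a set of subscripts."""
    base, _, subs = symbol.partition("/")
    return base, set(subs.split(",")) if subs else set()

def applies(rule_lhs, current):
    """A rule like NP/NOT_WH --> ... applies only to an NP carrying NOT_WH."""
    lhs_base, lhs_subs = parse(rule_lhs)
    cur_base, cur_subs = parse(current)
    return lhs_base == cur_base and lhs_subs.issubset(cur_subs)

def rewrite(current, rule_rhs):
    """Expand, propagating the symbol's subscripts onto every new symbol,
    since no subscript information may be lost during expansion."""
    _, inherited = parse(current)
    out = []
    for s in rule_rhs:
        base, own = parse(s)
        merged = sorted(own | inherited)
        out.append(base + ("/" + ",".join(merged) if merged else ""))
    return out
```

Note how `rewrite` copies NUMBER_SG onto both NP and VP: this mechanical propagation is what makes the rules grow unwieldy, as discussed in 2.2 below.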
    </Section>
    <Section position="2" start_page="19" end_page="19" type="sub_section">
      <SectionTitle>
2.2. Subscripts and Transformations
</SectionTitle>
      <Paragraph position="0"> In general, one cannot deny the possibility of incorporating (by means of subscripts or other devices) some of the information that is contained in a transformational grammar into a one-level grammar of PS type.</Paragraph>
      <Paragraph position="1"> But the grammar thus constructed will never generate all and only the grammatical sentences of the language. Either it will generate too little (the normal case for PSG without subscripts or similar devices) or, if it generates more, it will also generate some non-grammatical sentences (Harman 1963:611: "... not all sentences constructed in accordance with this grammar are well-formed."). A very simple example will show this. Suppose we want to transform optionally a sentence into its question counterpart. To do this in the PSG/S according to Harman, we have to choose an appropriate expansion of the symbol S2 (the same paths hold for number- and mode-restricted S: S1, resp. S2, Harman:600), namely either the second or the fourth rule in 3., the set of expansion rules for S2. We choose the second rule (normal question, the fourth rule concerns wh-questions):</Paragraph>
      <Paragraph position="3"> Now, note two things: in order to conform to the rules for this grammar, we have already added some of the subscripts from Rules 1 and 2 to the symbol S2 (e.g., NUMBER_SG and MODE_ACT). These subscripts, together with the new ones, are to appear on every symbol that is contained in every rule from now on (unless a delete subscript is introduced, cf. below). This is necessary, since we cannot let any information that is conveyed by the subscripts be lost, even if it be irrelevant to the symbol in question (such as, say, a MODE restriction on a NP). One can easily imagine that rewrite rules of this type soon become very unwieldy (even if we do not allow ourselves to be frightened by the prospects of "millions of rules", Harman:605). Thus, in rule 7 of this comparatively simple grammar we already have 6 subscripts to each symbol. This number is substantially increased in the more elaborate version of the grammar (see Appendix to Harman's article). This is certainly not what one would call simplicity of description.</Paragraph>
      <Paragraph position="4"> But objections of this kind can be met by the following consideration: even if the multiplication of entia, i.c. symbols and subscripts, seems without rationale for humans, one can conceive of it as a necessity for computer data handling, and the computer certainly does not mind going through all the subscripts, adding some, deleting others, etc., every time a symbol is mentioned or expanded. So, if one has a working program in which these restrictions can be written out as subroutines, and if the computer space needed does not exceed that available, the objection just made does not hold (cf. Harman:610fn.: "Many of these grammars are in the form of computer programs for generating actual sentences.") The other question is far more important. It can be split up into two parts:  1. Can all the data of the grammar be put into the subscript-restriction schema? 2. Will the subscript-restriction schema not</Paragraph>
      <Paragraph position="6"> put more data into my grammar than wanted? The first question concerns the adequate representation of the structure, the other expresses the fear that I may add structure to my grammar, thus producing sentences that are not grammatical (see Chomsky 1962:514ff.) Adopting a distinction made by Chomsky, I make the following assertion: A PSG/S will serve as a more or less adequate observational and descriptive representation of the facts covered by a normal PSG; as far as TG is concerned, the structure of the transformational model (how trees map into trees) will not be represented adequately on the descriptive (and perhaps not even on the observational) level by a PSG/S.</Paragraph>
      <Paragraph position="7"> In no case will the PSG/S attain the level of explanatory adequacy.</Paragraph>
      <Paragraph position="8"> The first part of my assertion can easily be proved from the observation that a normal PSG and a PSG/S are strongly equivalent grammars, the only difference being the notation. (On the notion of equivalence, cf. also Hays 1965:519).</Paragraph>
      <Paragraph position="9"> In fact, it makes no difference whether one expands a symbol on the basis of a rule to be affixed to the constituent by means of a subscript, or on the basis of a rule contained somewhere else in the grammar. The essential point is that generation proceeds from left to right, and one symbol is produced at a time. (See discussion above, 1.2).</Paragraph>
      <Paragraph position="10"> To prove the other half of the assertion made above, I will try to give an answer to the twofold question about representation of structure. Let's go back to the elementary example of the optional Tq, and try to imagine how this is handled in a PSG/S. The main difference between PSG and TG is that the rules in PSG operate on symbols, in TG on strings of symbols.</Paragraph>
      <Paragraph position="11"> When I put a subscript on a symbol that is part of a string, and I want to mark off a structure that is based on several symbols occurring in a certain order, I will have to mark all the symbols of my string in the same way, and this way of marking must be unique, i.e. define a unique path through the rules. This path may, in due course, require additions, deletions, permutations and the like. Now, in TG these operations are carried out after the PS derivation has been completed. In PSG/S, however, the cleavage between affirmative and interrogative sentences occurs already in the third rule, where S2 is expanded into NP + VP, VP + NP, respectively (omitting the subscripts). The two derivations follow separate paths through the rules: in terms of tree diagrams, what is left in the one is right in the other of the two trees. In this way, many PSG rules are unnecessarily duplicated (see above); moreover, the relationship between interrogative and declarative sentences, as defined in TG, is reduced to a remote common source of derivation, namely S2. It is not true that "Sentences are 'transformationally related' to the extent that the same choice of restrictions is made in their derivations and if the same lexical choices are made where possible" (Harman:608; single quotes are his), unless one takes "'transformationally related'" in a sense rather different from Chomsky's, namely: sentences that have a (partial) path through the rules in common.</Paragraph>
      <Paragraph position="12"> This is, in fact, the only 'transformational relation' that it is possible to define in a PSG/S, but unfortunately, it is not transformational. Even in the case that two paths coincide, and coincide altogether, we do not have 'transformational relatedness', but "grammatical similarity" (Harman:608). Lexical choices have nothing to do with this relation: both in PSG and in TG the choice on the lexical level is made after the application of expansion, respectively transformational rules. (This is not altogether correct: lexical choices may be made earlier and thus affect the derivation, but this is beside the point; complex symbols (see Klima 1964) are not taken into consideration here, but they could be built into a PSG as well as into any other generative grammar. I think, e.g., that some complex symbol could be devised to prevent sentences like The man walks the men, that could easily be generated in accordance with the rules described on pp.</Paragraph>
      <Paragraph position="13"> 609-10 of Harman's article.) In my opinion, a PSG/S will never be able to show transformational relationships as formally defined and described by Chomsky and others; hence such a grammar, even though it may attain a certain descriptive adequacy, will never give an explanation of the fact that precisely this, and not some other sentence, is transformed into another structure.</Paragraph>
      <Paragraph position="14"> 2.3. Deletion in a PSG Another difficulty in PSG/S concerns the problem of deletion rules. In normal PSG, no deletes are permitted (Chomsky 1961:9). Harman gives as reason for this restriction that trees must be uniquely recoverable in a PSG (p.603). This is, however, only part of the motivation. Deletes are not symbols: they cannot be expanded (unless one chooses to expand them into deletes, which is obviously useless in a description). Whenever a deletion rule occurs, the structure of the derivate is altered in such a way that rules may apply which originally should not.</Paragraph>
      <Paragraph position="15"> One could say that deletes are extremely context-sensitive: in Harman's PSG/S, which in reality is a highly restricted PSG, the number of rules having the form A --&gt; ∅ is very limited indeed, even though the author advocates their use (p.605). In passing, I would like to remark that nearly all of the deletion rules have to do with the expansion of NP/WH (this subscript occurs only once in the smaller grammar, p.609, and should therefore be rejected by the machine, since there are no constituents on which the rule could apply.) The real reason why a delete cannot be admitted in a PSG (especially a highly context-sensitive one) is that the rules following the deletion rules should be modified or altered completely, otherwise it would not be possible to keep the distinction between not-rewritten and rewritten symbols clear: the rules following deletion might thus operate on symbols originally belonging to the context. (Note, by the way, that in the case of wh-words the context problem is somewhat simplified by the fact that these words normally stand at the beginning of a sentence, so that the left context can be thought of as zero.) In our example, the transformational rule for interrogative sentences to be generated from declarative ones operates on a string of symbols that may be symbolized X1 - X2 - X3 (Chomsky 1957:112), carrying it into the shape X2 - X1 - X3.</Paragraph>
      <Paragraph position="16"> Now, suppose that in the course of the derivation of non-terminal symbols (the kernel string) we have a deletion rule operating, say, on X1. Suppose moreover that the non-terminal symbol following X3 qualifies for the conditions originally put on X3. The transformational rule will then operate on a string X2 - X3 - X4, and carry it into X3 - X2 - X4, thus generating a non-grammatical sentence. I do not pretend that the actual PSG/S as proposed and described by Harman in his article will generate these sentences: as already said, the grammar makes a very cautious use of deletions, so that sentences like the ones mentioned will not occur. This does not, however, invalidate the criticism.</Paragraph>
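The derivation just described can be worked through mechanically. A minimal sketch, assuming the transformational pattern X1 - X2 - X3 --&gt; X2 - X1 - X3 is applied blindly to the first three symbols of whatever string survives the deletion; the function names are illustrative.

```python
def question_transform(string):
    """Carry X1 - X2 - X3 into X2 - X1 - X3 (the interrogative pattern)."""
    x1, x2, x3 = string[:3]
    return [x2, x1, x3] + string[3:]

def delete(string, symbol):
    """A deletion rule: strike one occurrence of the symbol."""
    out = list(string)
    out.remove(symbol)
    return out
```

On the intact kernel the transform is well-behaved; after deleting X1, the surviving symbols shift into the structural positions meant for their neighbors, and the same rule produces X3 - X2 - X4, the ill-formed result the paragraph describes.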
      <Paragraph position="17"> Mey 24 Subscripts may not only be added in A PSG/S, but also deleted. In this manner a restriction that has been put on a certain rule can be re_ moved (this deletion of subscripts is of course quite another matter than the deletion of sym_ bols discussed above). Subscripts may be su)er_ fluous, such as in Rule 8.1 (p.609), where the subscript AUX_MODAL is removed from the constituent INFINITIVE by the subscript --AUX_MODAL, even though the lexicon would offer no ambiguous rewrites in the case of a non_ removal of the superfluous subscript. One could perhaps wonder why this precaution is taken, since in many other instances superfluous sub_ scripts persist all the way through the deriva_ tion (see discussion above). In other cases, the removal of subscripts can be motivated by the desire to orevent ungrammatical &amp;quot;loops&amp;quot;, i.e. endless recursive expansions that have no justification in the grammar. Thus in Rule 8ol the symbol VP3/AUX_MODAL is expanded into INFINITIVE/deg.. + VP3/AUX_HAVE,--AUY_MODAL, thus preventing another expansion by the same rule of VP 3. If, on the other hand, we wish the symbol in question to be expanded recursively (and according to the latest develo,)ment in TG there should be no difficulty in admitting recurslvity for all symbols, S not excluded: see Klima 1964), we can restart the cycle by wiping our slate, i.e. deleting all the sub_ scripts by means of the instruction ERASE.0TIIERS, to be incorporated as a subscript on the right Mey 25 hand side of the ruleo Naturally, we would ex_ pect a subscript of this kind to occur in those cases where a whole sentence is to be embedded into another hy means of what in early TG was called &amp;quot;generalized transformations&amp;quot; (Chomsky 1957: 113) o Th~ominalizing transformation is an instance i~ind&amp;quot; under ~g in the extended PSG/S (p.613), we find, among others, the entry: NP8 --&gt; Sl/CLAUSE. TYPE:NOMINALIZATION, SUBJ. 
INo GENITIVE, B, C ,D,E, Z,Y, ERASEoOTHERS This means that all the subscripts originally found on NP8 are to be deleted; the new sub_ scripts deal exclusively with the derivation of the embedded clause (as can easily be veri_ fled from the rules of the PSG/S as given in the Appendix of the article). 1~'hereas TG keeps track of the chan~es to be made by means of a structural description of the pair of kernel sentences involved, together with a formula for sh~/ctural change, in PSG/S we have only a con_ stituent NP to be expanded by means of DS rulesdeg How this NP fits into the stmucture of the ori_ ginal kernel sentence (being essentially its path through the PS derivation) can be fo\] low_ ed in .nSG by tracing back the nodes of the tree representation. In PSG/S, this path is marked by the subscriots added to the NP in question.</Paragraph>
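The subscript bookkeeping just described might be sketched as follows (a minimal toy with invented rule shapes, not Harman's exact notation): a blocking subscript such as -AUX-MODAL prevents re-expansion by the same rule, while ERASE.OTHERS wipes the slate and restores recursion at the price of the derivational record.

```python
# Minimal sketch of subscripted rewriting. A symbol is a (category,
# subscript-set) pair; subscripts both restrict rules and record the
# symbol's path through the derivation.

def expand_vp3(symbol):
    """Rule 8.1, roughly: VP3/AUX-MODAL -> INFINITIVE/... + VP3/AUX-HAVE,-AUX-MODAL."""
    cat, subs = symbol
    if cat != "VP3" or "-AUX-MODAL" in subs:
        return None                      # rule blocked: no endless loop
    return [("INFINITIVE", {"-AUX-MODAL"}),
            ("VP3", {"AUX-HAVE", "-AUX-MODAL"})]

def erase_others(symbol, keep):
    """ERASE.OTHERS: delete every subscript not explicitly kept."""
    cat, subs = symbol
    return (cat, subs & keep)

vp3 = ("VP3", {"AUX-MODAL"})
first = expand_vp3(vp3)                              # fires once
blocked = expand_vp3(first[1])                       # None: -AUX-MODAL blocks the loop
restarted = expand_vp3(erase_others(first[1], set()))  # recursion restored

# But the erased subscripts were the only record of the symbol's path
# through the derivation -- after ERASE.OTHERS that record is gone.
```

The design trade-off the paper criticizes is visible here: the same subscripts that stop the ungrammatical loop are also the only trace of derivational history, so deleting them buys recursion by destroying the bridge back to the original S.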
      <Paragraph position="18"> Now, all this information is struck from the record by the removal of the subscripts in ac,:ordance with the instruction ERASE OTHERSdeg ~struCtural descri!)tion of the sentence as a whole is not available: the expansion of NP8 destroyed our bridge back to the original So It is as if we ha~een expanding a constituent while forgetting what it was we were expanding.</Paragraph>
    </Section>
    <Section position="3" start_page="19" end_page="19" type="sub_section">
      <SectionTitle>
2. Conclusion
</SectionTitle>
      <Paragraph position="0"> Of the two models discussed here, the first one (PSG/PDS) has not actually been proposed as a full_scale grammatical mo:Iel, but I have tried to show that the implications of the claim that any shorgcomings of PSG can be ow:rcome lead to difficulties of about the same nature as those encountered in the second molel (PSG/S).</Paragraph>
      <Paragraph position="1"> Descriptive adequacy is not attained in those cases where structural descriptions are rele_ vant for the operation of the rules: neither PSG/PDS nor PSG/S permits one structural descr_ iption to be carried over into another. As one will have noticed, the argument in both cases runs a lon~ the same lines. Moreover, of the several devices proposed by Har:~an to boost the .)ower of PSG, the deletion rule was explicitly rejected on the ground that it would add too much power to the ~rammaro On the other hand, the use of subscripts, no matter how carefully chosen, will not help enlarge the descriptive Dower oi&amp;quot; the gramm,~r (Harman 1963:605) enough to account for all the grammatical sentences of the language. Thus, one_level grammars like the ones discussed above will not attain explanatory adequacy in any case, and in some cases not even descriptive adequacy. &amp;quot;Dieser Versuch /namely, the defense of phrase structure, JM/ verfehlt den entscheidenden Punkt abet in zweifacher }{insicht: Erstens uberschreiten die Regeln Harmans die Kapazitat einer PSGo Und zweitens losen such sie nicht das Problem einer geigneten</Paragraph>
    </Section>
  </Section>
class="xml-element"></Paper>