<?xml version="1.0" standalone="yes"?>
<Paper uid="J79-1025">
  <Title>PROGRAMMING LANGUAGES</Title>
  <Section position="2" start_page="3" end_page="6" type="metho">
    <SectionTitle>
3 A natural language, such as English, is specified
</SectionTitle>
    <Paragraph position="0"> syntactically and semantically by defining a set L constituting the sentences of the language, together with a subset ⊢ of L* × L (where 'L*' denotes the power set of L), such that {s1, ..., sn} ⊢ s iff s is implied by the premisses s1, ..., sn (n ≥ 0, si ∈ L whenever 1 ≤ i ≤ n). (In the limiting case, the null set: ∅ ⊢ s iff s is analytic.) In practice, the infinitely numerous members of L are generated by a finite set of context-free phrase-structure rules, together with syntactic transformations which operate on the structures defined by those rules. The infinitely numerous members of ⊢ will be defined by a specification of a relation between the sentences of L and a set of arrays of symbols called 'semantic representations' of those sentences, together with a finite set of rules of inference, similar to those of extant formal logics, which permit the construction of a derivation containing the semantic representation of s as conclusion and the semantic representations of s1, ..., sn as premisses just when {s1, ..., sn} ⊢ s.</Paragraph>
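The picture just given, of an infinite relation ⊢ generated by a finite set of inference rules, can be made concrete. The following minimal sketch is my own illustration (the formula encoding and the function names are assumptions, not part of the text), taking modus ponens as the single rule:

```python
# Minimal sketch (not from the paper): the entailment relation |- generated
# by one structure-dependent rule, modus ponens. Formulas are tuples:
# ('atom', name) or ('imp', antecedent, consequent).

def modus_ponens(premisses):
    """One round of modus ponens applied to a set of formulas."""
    derived = set(premisses)
    for f in premisses:
        # Structure-dependent: the rule matches only formulas whose
        # *immediate* constituent is the implication sign.
        if f[0] == 'imp' and f[1] in premisses:
            derived.add(f[2])
    return derived

def entails(premisses, conclusion, max_steps=100):
    """Close the premisses under the rule; test membership of the conclusion."""
    current = frozenset(premisses)
    for _ in range(max_steps):
        nxt = frozenset(modus_ponens(current))
        if conclusion in nxt:
            return True
        if nxt == current:
            return False
        current = nxt
    return False

p, q, r = ('atom', 'p'), ('atom', 'q'), ('atom', 'r')
print(entails({p, ('imp', p, q), ('imp', q, r)}, r))  # True
```

The rule here is 'structure-dependent' in the sense the surrounding text describes: it inspects the gross constituent structure of a formula, not arbitrary occurrences of the implication sign within it.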
    <Paragraph position="1"> The 'generative semanticists' have argued that the relation between sentences and their semantic representations is defined by the transformational rules revealed by independently-motivated syntactic analysis (e.g. Postal [1970] 1971: 252f.); although this hypothesis is certainly not altogether correct (see e.g. Partee 1971), it seems likely that the semantic representation of a sentence is some simple function of its syntactic deep structure and its surface structure. The rules of inference for natural languages will no doubt exhibit the 'structure-dependence' characteristic of syntactic transformations, as do the rules of inference of formal logics, cf. Sampson (1975a: 163-7, forthcoming) (thus, the term 'X⊃Y' in the standard rule of modus ponens, i.e. '{X⊃Y, X} ⊢ Y', is a structural description not of all formulae containing an instance of '⊃' but only of those in which '⊃' is an immediate constituent of the whole formula). For discussion of the philosophical problems involved in this way of describing natural-language semantic analysis, including problems relating to the analytic/synthetic distinction, cf. Sampson (1970, 1973a, 1975a: ch. 7).</Paragraph>
    <Paragraph position="2"> 4. It is tempting to view the mind of a speaker of e.g.</Paragraph>
    <Paragraph position="3"> English as an automaton in the defined sense, with the sentences of English as the programs of its machine language, and the rules of inference of English determining the successor-state relation. In other words, some component of the mind of an English-speaker would be a device capable of entering any * On the assumption that the relation between sentences and semantic representations is a function. In practice it will not be (ambiguous sentences will have more than one semantic representation), so 'the' should read 'one of the ... (respectively)'.</Paragraph>
    <Paragraph position="4"> one of a (perhaps infinitely) large number of discrete states; hearing (or reading) a sentence would move this device from one state to another in accordance with definite rules; and other rules related to the rules of inference of English would govern how it passes through different states when not immediately reacting to speech (i.e. when the owner of the mind is thinking). Although the analogy is tempting, extant computers and their machine languages are not promising as sources for a theory of the relation between human minds and natural languages. The machine language sketched above is not at all reminiscent of natural languages. The latter typically contain infinitely many sentences, only the simplest of which are used in practice; the machine language of §2 contains an enormous but finite number of programs, and the programs which are useful in practice (those which compute important functions) are not typically 'simple' in any obvious sense.</Paragraph>
    <Paragraph position="5"> Fortunately, the machine languages of the various extant computers are not the only artificial programming languages in use. Partly for the very reason that machine languages are so different from natural languages, most programs are written not in machine languages but in so-called 'high-level' programming languages, such as FORTRAN, SNOBOL, APL, PL/1 (to name a few among many). We may think of a computer A_M supplied with a compiler program for some high-level language L_H as simulating the workings of a very different computer, say A_H.</Paragraph>
    <Paragraph position="6"> No such computer as A_H actually exists: high-level languages are not typically the machine languages of any physical computers, and there are undoubtedly sound engineering reasons for this.</Paragraph>
    <Paragraph position="7"> But the abstract automaton A_H may be described just as precisely as the automaton A_M which underlies the real computer.</Paragraph>
    <Paragraph position="8"> One who programs an A_H 'system' (i.e. a conjunction of computer with L_H-compiler) commonly thinks of the machine he is dealing with as having the properties of A_H, and may be quite unaware of the properties of the machine A_M with</Paragraph>
    <Paragraph position="9"> which he is in fact interacting.</Paragraph>
    <Paragraph position="10"> High-level languages, and the abstract automata whose 'machine languages' they are, differ from one another in more interesting ways than do real computers and their machine languages; and furthermore (not surprisingly, since high-level languages are designed to be easily usable by human programmers) they are much more comparable with human languages than are real machine languages. (Typically, a high-level programming language is a context-free phrase-structure language, for instance.) I shall suggest that the relationship between high-level languages and their corresponding automata gives us much better clues about human mental machinery than does that between real computers and machine languages.</Paragraph>
    <Paragraph position="11"> 5. Let me first give an example of a high-level language: I shall choose the language APL (see e.g. Iverson 1962, Pakin 1968). APL is interesting for our purposes because it is particularly high-level: i.e. it is related more distantly to the machine languages of real computers, and more closely to human languages, than many other high-level languages. It is a real-time rather than batch-processing language, which means that it is designed to be used in such a way that the result of inputting a program will normally be crucially dependent on the prior state of the system (in a batch-processing language, programs are designed to be unaffected by those remains of the prior state which survive their input): this is appropriate for an analogy with human language, since presumably the effect on a person of hearing a sentence depends in general on his prior system of knowledge and belief. The complete language APL includes many features which are irrelevant to our analogy. For instance, there is a large amount of apparatus for making and breaking contact with the system, and the like; we shall ignore this, just as we shall ignore the fact that in human speech the effect of an utterance on a person depends among other things on whether the person is awake. Also, APL provides what amounts to a method of using the language to alter itself by adding new vocabulary; to discuss this would again complicate the issues we are interested in. We shall assume that programmer and system are permanently in contact with one another, and shall restrict our attention to a subset of APL to be defined below: rather than resorting to a subscript to distinguish the restricted language from APL in its full complexity, we shall understand 'APL' to mean the subset of APL under consideration.</Paragraph>
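The real-time/batch contrast drawn above can be sketched in modern terms. This is an illustrative toy of my own (the class and function names are invented, not APL's): a real-time system interprets each program against the state left behind by earlier programs, while a batch system starts every program from a fresh state:

```python
# Sketch of the real-time/batch contrast described in the text: in the
# real-time system the result of a program depends on the prior state;
# in the batch system it cannot.

class RealTimeSystem:
    def __init__(self):
        self.state = {}            # persists across inputs

    def run(self, program):
        program(self.state)
        return dict(self.state)    # snapshot of the state after this input

class BatchSystem:
    def run(self, program):
        state = {}                 # fresh state for every program
        program(state)
        return state

def prog(state):
    # The effect depends on what is already in the state.
    state['N'] = state.get('N', 0) + 1

rt, batch = RealTimeSystem(), BatchSystem()
print(rt.run(prog), rt.run(prog))        # {'N': 1} {'N': 2}
print(batch.run(prog), batch.run(prog))  # {'N': 1} {'N': 1}
```

Running the same program twice gives different results in the real-time system but identical results in the batch system, which is the property the text singles out as appropriate for the analogy with hearing a sentence.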
    <Paragraph position="12"> 5 The practicing computer user may find my definition of the real-time/batch-processing distinction idiosyncratic; the difference I describe is the only one relevant for our present purposes, but it is far from the most salient difference in practice.</Paragraph>
    <Paragraph position="13"> 6 In APL terms, we ignore all system instructions, i.e.</Paragraph>
    <Paragraph position="14"> words beginning with A. Note that we use wavy underlining (corresponding to bold type in print) to quote sequences of symbols from an object-language, whether this is an artificial language such as APL or a natural language such as English.</Paragraph>
    <Paragraph position="15"> 7 In APL terms, we ignore the definition mode and the use of the characters V A : + .</Paragraph>
    <Paragraph position="16">  We begin by defining the set</Paragraph>
  </Section>
  <Section position="3" start_page="6" end_page="90" type="metho">
    <SectionTitle>
Σ_APL of states of A_APL.
</SectionTitle>
    <Paragraph position="0"> First, we recursively define a set of APL-properties: any positive or negative real number is a numeric APL-property of dimension ∅; any character on the APL keyboard (i.e. any of a finite set of characters whose identity does not concern us) is a literal APL-property of dimension ∅; for any integer n and integer-string D, any n-length string over the set of numeric (literal) APL-properties of dimension D is a numeric (literal) APL-property of dimension D⌢n. 8 For any finite string D over the integers, any numeric or literal APL-property of dimension D is an APL-property, and nothing else is such. Clearly, there are infinitely many APL-properties. The length of the dimension of an APL-property is the rank of that APL-property. Thus, a number is a rank-0 numeric APL-property; a four-letter word, e.g. LOVE, is a rank-1 literal APL-property. 8 The symbol ⌢ stands for concatenation (a dimension is always an integer-string). Concatenation is a function from pairs of strings to strings, so we should strictly write 'D⌢⟨n⟩' (where '⟨n⟩' is the length-1 string whose only member is n). For any set S and any integer n ≥ 0, we use the terms 'n-tuple of elements of S' and 'length-n string over S' interchangeably for any function from the segment ⟨1, 2, ..., n⟩ of the natural numbers into S; note that the null set ∅ is therefore the length-0 string over any set.</Paragraph>
    <Paragraph position="1"> with an alphabetic character: there are therefore infinitely many APL-identifiers. We define Ident as the set including all APL-identifiers together with an entity, assumed to be distinct from all the APL-identifiers, denoted by the symbol ⎕. An APL-object is a pairing of any member of Ident with an APL-property; we call the first member of an APL-object the identifier of the object and the second member the property of the object. An APL-state is a finite set of APL-objects in which no distinct objects bear the same identifier. (We may thus think of an APL-state as a function from a finite subset of Ident into the set of APL-properties.) We write 'Σ_APL' for the set of all APL-states: clearly, Σ_APL is infinite.</Paragraph>
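The definitions of APL-property, rank, and APL-state above can be modelled directly. A minimal sketch, using a representation of my own rather than the paper's notation (a property as a (dimension, items) pair, and a state as a dictionary, i.e. a function from identifiers to properties):

```python
# Sketch (assumed representation): an APL-property is a pair
# (dimension, items); its rank is the length of its dimension.

def scalar(x):
    return ((), x)                     # dimension is the empty string

def vector(items):
    return ((len(items),), tuple(items))

def rank(prop):
    dimension, _ = prop
    return len(dimension)

# A rank-0 numeric property and a rank-1 literal property:
assert rank(scalar(3.5)) == 0
assert rank(vector("LOVE")) == 1       # the four-letter word of the text

# An APL-state: no two distinct objects share an identifier, so a dict
# models it directly.
state = {'A': vector([1, 1, 1]), 'B': scalar(10)}
print(rank(state['A']))   # 1
```

The dict-based state also makes concrete the parenthetical remark that an APL-state may be viewed as a function from a finite subset of Ident into the set of APL-properties.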
    <Paragraph position="2"> We now define the language L_APL of A_APL. L_APL is generated by the context-free grammar on p. 16. The initial symbol of that grammar is asst (for 'assignment'). Since capital letters occur among the terminal symbols of L_APL, we use minuscules as non-terminals; terminal symbols of L_APL are wavy-underlined (cf. note 6, p. 3), whether they are letters or other characters. df → [any of a large finite set of symbol(-string)s denoting partial dyadic functions on the set of APL-properties] tf → [any of a finite set of symbol(-string)s denoting partial triadic functions on the set of APL-properties] deic → [any of a small finite set of symbol-strings denoting total monadic functions from the set of possible programming-acts into the set of APL-properties] 9 In practice one cannot write a length-0 string, and one cannot distinguish a length-1 string from a rank-0 property; but The sentences of L_APL are the strings defined by the above grammar, disambiguated by the use of round brackets (with association to the right where not indicated by bracketing). The sequence of symbols ⎕← may optionally be deleted when initial in a sentence. Clearly there are infinitely many sentences in L_APL. A sentence of L_APL is an APL-program. We now go on to specify the function Int_APL from Σ_APL × L_APL into Σ_APL which specifies the change of APL-state brought about by a given APL-program.</Paragraph>
    <Paragraph position="3"> To determine the new state arrived at from an arbitrary I ignore these practical complications for the sake of simplicity. 10 I ignore complications relating to strings containing the inverted comma character. 11 Some of these functions, and their names, are common to all 'dialects' of APL: e.g. !, which denotes the function taking integers into their factorials, strings of integers into the corresponding strings of factorials, etc., and which is undefined e.g. for literal APL-properties. The facility of 'user-definition' (cf. note 7) permits a programmer to alter APL by adding new functions.</Paragraph>
    <Paragraph position="4"> 12 APL contains no triadic functions other than user-defined ones.</Paragraph>
    <Paragraph position="5"> 13 E.g. one such symbol denotes the function taking any programming-act into a string of integers representing the time of day at which it occurs.</Paragraph>
    <Paragraph position="6"> There are a number of syntactic complications, akin to syntactic transformations in natural languages, concerning a dyadic function called index, which is denoted by square brackets; we ignore these complications, and shall not consider 'index' apart from the other dyadic functions.</Paragraph>
    <Paragraph position="7"> current state on input of an arbitrary program, we consider the phrase-marker of which that program is the terminal string. Beginning at the leaves and working towards the root, and evaluating the rightmost node whenever there is a choice, we associate each dscr node with an APL-property as its denotation and each asst node with a change to be made to the current APL-state: the new APL-state is the one that results from the old state by making all the changes associated with the various asst nodes in the order mentioned, terminating with the change associated with the root asst node (which may of course be the only one).</Paragraph>
    <Paragraph position="8"> If at any point a dscr node cannot be assigned a denotation (e.g. because it is realized as an APL-identifier which is not the identifier of any object in the current state), the state-changes already made (if any) are the total changes achieved by that program.</Paragraph>
    <Paragraph position="9"> The rules for evaluating nodes are as follows: A dscr node realized as an identifier denotes the APL-property, if any, paired with that identifier in the current state; a dscr node rewritten as numname or litname denotes the APL-property denoted by its 'exponent' (i.e. the terminal material it dominates); a dscr node rewritten as deic denotes the APL-property given by applying the function denoted by its exponent to the current programming-act; when a dscr node d0 dominates an mf node d1 followed by a dscr node d2, if d1 denotes some monadic function f and d2</Paragraph>
    <Paragraph position="11"> denotes an APL-property p, then d0 denotes the APL-property f(p), provided p ∈ dom(f); the extension to dscr nodes rewritten with df or tf nodes is parallel.</Paragraph>
    <Paragraph position="13"> An asst node dominating a member i of Ident, followed by ←, followed by a dscr node denoting an APL-property p, adds</Paragraph>
    <Paragraph position="15"> a new APL-object (i, p) to the current state, destroying any object with the identifier i which may already exist in the current state; if i is ⎕, a representation of p is printed out; 15 a dscr node immediately dominating an asst node which has created an APL-object with the property p denotes p.</Paragraph>
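The evaluation rules for dscr and asst nodes amount to a small interpreter. The following sketch is mine, not the paper's formal Int_APL: the AST encoding, the string '[]' as a stand-in for the print symbol ⎕, and the helper names are all assumptions:

```python
# Interpretive sketch of the dscr/asst rules above for a toy fragment.

def eval_dscr(node, state, output):
    kind = node[0]
    if kind == 'name':                  # identifier: look up the current state
        return state[node[1]]
    if kind == 'lit':                   # numname/litname: denotes its exponent
        return node[1]
    if kind == 'mf':                    # monadic function applied to a dscr
        return node[1](eval_dscr(node[2], state, output))
    if kind == 'asst':                  # embedded assignment denotes its property
        return eval_asst(node, state, output)
    raise ValueError(kind)

def eval_asst(node, state, output):
    _, ident, dscr = node
    p = eval_dscr(dscr, state, output)
    if ident == '[]':                   # stand-in for the quad symbol:
        output.append(p)                # print a representation of p
    else:
        state[ident] = p                # destroys any object with identifier i
    return p

state, output = {'B': 10}, []
# Roughly: Q <- succ B, where succ is a stand-in monadic function.
eval_asst(('asst', 'Q', ('mf', lambda x: x + 1, ('name', 'B'))), state, output)
print(state)   # {'B': 10, 'Q': 11}
```

A lookup of a missing identifier raises an error here, matching the rule that a dscr node without a denotation halts the program after whatever state-changes have already been made.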
    <Paragraph position="16"> As a (trivial) example of the operation of these rules, consider the program input to A_APL in state {(A, 1 1 1), (B, 10)}. This program has the constituent structure which appears in Fig. 1 on the next page (in which dscr and asst nodes are numbered in the order they are to be evaluated).</Paragraph>
    <Paragraph position="17"> Suppose the program is input in the morning, say at 11.30 a.m. Then dscr1 will denote the string 11 30 0. The function > takes (12 0 0, 11 30 0) into 1 0 0, which becomes the denotation of dscr2; in fact dscr2 will denote 1 0 0 whenever the program is input in the morning and 0 0 0 whenever it is input in the afternoon (when the hour integer will be 12 or more). The monadic function +/ adds the numbers in a string, so if dscr2 denotes 1 0 0 then dscr3 denotes 1. Dscr4 denotes 10</Paragraph>
    <Paragraph position="19"> of dscr, in which case the dscr is assigned an APL-property input by the programmer at the time the dscr is evaluated by the system. We ignore this, since it interferes with the analogy with natural language. In the full version it is also possible to output symbol-strings which do not represent individual APL-properties; again we ignore this.</Paragraph>
    <Paragraph position="20"> out 10. The monadic function ρ gives the dimension of any APL-property, and dscr5 denotes 1 1 1, so dscr6 denotes 3, and hence dscr7 denotes 13. Finally, asst2 adds an object (Q, 13)</Paragraph>
    <Paragraph position="24"> to the current state.</Paragraph>
    <Paragraph position="25"> In other words, if the program is input in the morning and the prior state is as quoted, the final state will be {(A, 1 1 1), (B, 10), (…, 10), (Q, 13)}, whereas if it were input in the afternoon to A_APL in the same prior state the resulting state would be {(A, 1 1 1), (B, 10), (…, 0), (Q, 3)}. To define A_APL fully it remains only to specify the relation Suc_APL on Σ_APL which controls the changes-of-state A_APL undergoes spontaneously. Suc_APL is the empty relation: every APL-state is a stopping state. A programmer working in APL has no wish for the system to take actions beyond those specified by his programs: by defining monadic, dyadic, or triadic functions of any complexity he wishes, he can get the answers to his questions simply by carrying out the state-changes specified in his program. (In the machine language of a genuine computer, on the other hand, the state-changes brought about by programs are of no intrinsic interest, and the input of a program is of value only in that it brings the computer to a state from which it proceeds spontaneously to perform actions useful to its programmer.) 7.</Paragraph>
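The morning/afternoon example can be re-computed. The program's exact APL text is illegible in this copy, so the sketch below reconstructs its effect from the prose only (the function names and the intermediate variable C are my assumptions): > compares element-wise, +/ sums a string, and ρ here returns the length of a vector:

```python
# Reconstruction of the example's arithmetic from the surrounding prose.

def gt(xs, ys):                 # dyadic >: element-wise comparison
    return [1 if x > y else 0 for x, y in zip(xs, ys)]

def plus_reduce(xs):            # monadic +/: sum of a string of numbers
    return sum(xs)

def rho(xs):                    # monadic rho: length (dimension) of a vector
    return len(xs)

def run(time_of_day, A, B):
    # 1 whenever the hour is before 12, 0 from noon onwards.
    morning_flag = plus_reduce(gt([12, 0, 0], time_of_day))
    C = B * morning_flag
    Q = rho(A) + C
    return C, Q

print(run([11, 30, 0], [1, 1, 1], 10))  # morning: (10, 13)
print(run([15, 0, 0], [1, 1, 1], 10))   # afternoon: (0, 3)
```

Under this reconstruction the final values (Q, 13) in the morning and (Q, 3) in the afternoon agree with the states quoted in the text.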
    <Paragraph position="26"> It may seem contradictory to say that a real digital computer, which will have only finitely many states and possible programs, can be made to simulate an automaton such as A_APL which has infinitely many states and programs. And, of course, in practice the simulation is not perfect. Although an APL-state may contain any number of objects, for any APL computer/compiler system there will be a finite limit on the number of objects in a state; although any real number may be an APL-property, in a practical APL system real numbers are approximated to a finite tolerance.</Paragraph>
    <Paragraph position="27"> The situation is quite analogous to the case of natural language, where the individual's 'performance' is an imperfect realization of an ideal 'competence', in one sense of that distinction; just as in linguistics, so in the case of high-level programming languages it is normal to give a description of the ideal system separately from a statement of the limitations on the realization of that system in practice, which will differ from one person to another in the natural language case, from one computer/compiler pair to another in the programming language case.</Paragraph>
    <Paragraph position="28"> Other high-level programming languages differ from APL not only in terms of their sentences but in terms of the nature of the states on which their sentences act. Thus in states of e.g. SNOBOL, all objects are character strings; in PL/1, objects include not only arrays of the APL kind but also trees, trees of arrays, arrays of trees, etc. Space does not permit a survey of the differences between high-level languages with respect to the nature of their states.</Paragraph>
    <Paragraph position="29"> 8. At this point we are ready to begin to answer Apostel's question about what sort of automata natural languages are appropriate programming languages for. Any answer to such a novel question must obviously be very speculative; but the ideas that follow seem plausible enough to be worth consideration. We do not know with any certainty even what the semantic representations or syntactic deep structures of our sentences are; but we have seen that there is good reason to think the two may be similar, and we can make more or less detailed conjectures about their form. In my exposition I shall make various assumptions about semantic representations, some of which have already been made for independent reasons by other scholars. Insofar as my theory depends on these assumptions, a refutation of the latter refutes the theory -- this is one of the respects in which my theory is falsifiable, i.e. scientific. I shall present my theory in a relatively informal, intuitive way to begin with, and formalize it more carefully later. What we are looking for is a specification of a set Σ_Eng of states, which we can interpret as states of some subpart of the mind of an English-speaker, such that semantic representations of English sentences are rather natural devices for moving this part of the mind from one of its states to another. It will be convenient to have some name for that part of a human's total psychological make-up which is described by specifying Σ_Eng. In earlier, unpublished work I have called this the topicon (coined on the analogy of 'lexicon'), since I envisage it as containing a set of entities corresponding to the objects of which its owner is aware, and to which he can therefore take a definite description to refer.</Paragraph>
    <Paragraph position="30"> Σ_Eng, then, is to be a set of possible topicon-states. The sets of topicon-states available to speakers of natural languages other than English will differ from Σ_Eng (§17 below), but not in respect of the properties on which this paper will concentrate. Note that a topicon-state is certainly not to be equated with a 'state of mind' or 'psychological state': a topicon is claimed to be only one small part of a human's mental machinery, and there will be many ways in which the latter can vary -- e.g. the human may be happy or sad, asleep or awake -- without implying any difference in topicon-state.</Paragraph>
    <Paragraph position="31"> Just as an APL-state contains a set of APL-objects with properties drawn from a fixed class, so a topicon-state will contain a set of objects I shall call referents. 16 Suppose some person P knows of the existence of a red car C; then P's topicon will include a referent c corresponding to C. The referent c will be P's 'Idea' of C, in Geach's terms (1957). The possible properties for referents will be determined by the vocabulary of P's language, in this case English: each lexical item of English will correspond to a referent-property.</Paragraph>
    <Paragraph position="32"> I shall use Geach's operator §( ) (1957: 52) to form names of referent-properties from lexical items: if P knows that C is a red car then c will have the properties §(red) and §(car). (An element of a mental state cannot be red, but it can be §(red).) P's topicon will include not only referents representing physical objects but referents for any entities of which P is aware and which he can take definite descriptions (referential NPs) to denote: there will be referents for characters in fiction, for abstractions like the centre of this circle, etc. But, at any given time, P's topicon will contain only a finite number of referents. Given enough time, of course, there is no limit to the number of objects whose existence P could deduce or imagine; and I shall suggest that for P to 16 Here and below, rather than coining neologisms I use terms having established usages in philosophy and logic in senses which clash with their normal use; in such cases I use the term only in the sense I define.</Paragraph>
    <Paragraph position="33"> deduce or imagine the existence of some entity B is for P's topicon to acquire a new referent representing B. But deduction and imagination take time: in a finite amount of time P's topicon will have acquired only finitely many referents.</Paragraph>
    <Paragraph position="34"> Consider sentences (2a) and (2b) addressed to P (and let us simplify things initially by supposing that P does not previously know of any red cars -- we shall consider the more general case in §10): (2a) I bought a red car today. (2b) I sold the red car today.</Paragraph>
    <Paragraph position="35"> The NP a red car in (2a) will create a referent with the properties §(red) and §(car) in P's topicon. On the other hand, when he hears (2b) the NP the red car will pick out the referent which a red car has already created (in order to act on it in ways which will be discussed later). In other words, the distinction between the red car and a red car is quite parallel to the distinction between dscr and asst constituents, respectively: the former selects from the current state, the latter adds an object to the current state. Let us call natural-language expressions which act in the former way 'identifying expressions' (IEs), and expressions which act in the latter way 'establishing expressions' (EEs). Clearly the IE/EE distinction is related to the traditional distinction between 'definite' and 'indefinite' NPs. However, I do not claim that all definite and indefinite NPs count as IEs or EEs respectively. Consider, for instance, the de dicto / de re (or opaque reference / transparent reference) ambiguity exhibited by NPs in 'intensional contexts' (see e.g. Quine 1960: §30;</Paragraph>
    <Paragraph position="37"> In the de dicto sense of an elephant, (3) does not imply that there are any elephants, although in the de re sense it implies that there is at least one. Only in the de re sense is an elephant in (3) an EE in my terms, though syntactically an elephant is clearly an 'indefinite NP' in both cases. On the other hand, in sentences containing quantifiers, definite NPs may not always be IEs: thus, in (4) the definite NP the dust-jacket does not denote a particular object (and therefore, perhaps, does not pick out a particular one of the hearer's referents), just as the indefinite NP a book does not seem to establish the existence of a single book: (4) Whenever I buy a book I remove the dust-jacket. I shall develop my theory with a view to handling the subset of English which excludes quantification and intensional contexts, and in which 'definite' and 'indefinite' NPs do coincide with IEs and EEs respectively. Later I shall consider some of the aspects of English which my theory does not handle successfully as it stands.</Paragraph>
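For the quantifier-free, extensional subset just delimited, the EE/IE mechanism can be sketched as operations on a topicon. The data structures and method names below are my own illustration, not the paper's formalism (referent-properties are written 'S(red)' etc. as a plain-text stand-in for the §-operator):

```python
# Sketch of the EE/IE contrast: an EE like "a red car" adds a referent
# bearing the named properties; an IE like "the red car" selects an
# existing referent bearing them.

class Topicon:
    def __init__(self):
        self.referents = []            # finitely many referents at any time

    def establish(self, properties):   # EE: create a new referent
        referent = set(properties)
        self.referents.append(referent)
        return referent

    def identify(self, properties):    # IE: pick out an existing referent
        wanted = set(properties)
        matches = [r for r in self.referents if wanted.issubset(r)]
        return matches[0] if matches else None

P = Topicon()
P.establish({'S(red)', 'S(car)'})      # "I bought a red car today"  (EE)
r = P.identify({'S(red)', 'S(car)'})   # "the red car" picks it out  (IE)
print(r)
```

This mirrors the asst/dscr parallel drawn earlier: establish adds an object to the current state, identify merely selects from it. The tie-breaking among several matching referents is deliberately naive here; the text takes that question up in §10 and §16.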
    <Paragraph position="40"> Even in the subset of English considered here, not all definite NPs will in fact refer to referents already in the hearer's topicon. For instance, a child may come home from school at the beginning of a new term and say to his mother: (5) I saw the new teacher. It may be that the mother does not know that there is a new teacher, i.e. it may be that her topicon contains no referent for which the phrase the new teacher is appropriate; but in practice she is likely to work out from her child's sentence that there is a new teacher, and to understand that the child saw him. In other words, the phrase the new teacher acts as an EE to create a new referent in the mother's topicon. However, it seems plausible to say that this is in some sense not the central use of a phrase such as the new teacher; it would be more appropriate, if the mother does not know about the new teacher, for the child to say something like: (6) There is a new teacher at school and I saw him, in which the referent denoting the teacher is first introduced by an EE and only then re-identified by an IE. Notice that the mother may react to (5) by saying something like: (7) What new teacher? I didn't know there was one. -- which would not be a possible reaction to a sentence using the EE a new teacher.</Paragraph>
    <Paragraph position="41"> What happens when the mother successfully acquires a new referent in response to (5), I suggest, is that she imagines some circumstance in which the new teacher would succeed in picking out a referent in her topicon -- for instance, if there were a new teacher at her child's school -- and, in imagining these circumstances, creates the referent; after which the sentence operates on her topicon in the normal way. The APL system does not work like this: if one inputs the sentence A ← B to an APL-state lacking an object named B, the system prints out a message pointing out one's error but does not change state. It is natural enough, though, that human linguistic behaviour shows more initiative than the behaviour of artificial automata. A programmer has complete control over the automaton he programs, and it is easier to require the programmer to get his programs right than to equip the automaton with routines to guess what the programmer means by defective sentences. A human speaker, on the other hand, has no way of knowing exactly what state his hearer's topicon is in, so it is all to the good if the hearer can compensate in simple cases for defects in the speaker's sentences.</Paragraph>
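The contrast just drawn, between the unforgiving APL system and the accommodating human hearer, can be sketched as two assignment routines. The helper names and the None placeholder are my assumptions, not details of APL itself:

```python
# Sketch: an APL-like system rejects an assignment from an undefined name
# and leaves the state unchanged; a human-like hearer accommodates by
# first creating the missing referent, then proceeding normally.

def apl_assign(state, target, source):
    if source not in state:
        print("VALUE ERROR:", source)  # report the error...
        return state                   # ...but do not change state
    new_state = dict(state)
    new_state[target] = state[source]
    return new_state

def accommodating_assign(state, target, source, default=None):
    new_state = dict(state)
    if source not in new_state:
        new_state[source] = default    # imagine/create the missing object
    new_state[target] = new_state[source]
    return new_state

s = {}
print(apl_assign(s, 'A', 'B'))            # {}  (state unchanged)
print(accommodating_assign(s, 'A', 'B'))  # {'B': None, 'A': None}
```

The design point matches the text: a programmer can be required to get programs right, so the strict behaviour is cheap and safe; a speaker cannot inspect the hearer's topicon, so the accommodating behaviour earns its extra machinery.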
    <Paragraph position="42"> Since I shall frequently be speaking of the relations between linguistic expressions, topicon-referents, and the entities in the outside world which the linguistic expressions denote, let me lay down some terminological conventions. I shall use denotation for the relation between an IE and the thing which a hearer takes that IE to correspond to; my theory asserts that denotation is a composition of two relations, a relation of reference between linguistic expressions and topicon-referents, and a relation of representation between topicon-referents and things. Thus, if the phrase your car said to P now picks out a referent c in P's topicon, and if P owns exactly one car C, then your car refers to c and denotes C, and c represents C. We say that c is the referent of, and C the denotatum of, your car (on this occasion). Notice that an IE may refer without denoting: if P has read Crime and Punishment, then his topicon will contain two referents, say r1 and r2, such that Raskolnikov refers to r1 and Alyona Ivanovna refers to r2, even though neither of these NPs denotes anything (and, correspondingly, r1 and r2 will both have the property §(fictional)).</Paragraph>
    <Paragraph position="44"> Furthermore, identity of the denotata of two IEs does not imply identity of their referents.</Paragraph>
    <Paragraph position="47"> Thus, if P knows that I have exactly one brother and that he is the new doctor, then the IEs the new doctor and my brother will refer to the same referent in P's topicon, and hence also denote the same man (the details of reference by means of the genitive construction are discussed in §12 below); but if P knows that I have one brother and that there is a new doctor, but does not realize that they are the same man, then the two IEs will refer to different referents in P's topicon, even though each of these referents will in fact represent the same man. 10. So far we have assumed that, when a hearer hears an IE such as the red car, his topicon contains only one referent with the properties §(red) and §(car). Clearly this will not in general (or even usually) be so: when one hears an IE, it will often be the case that one knows of a number of objects fitting the description. One who hears the red car will take the phrase to refer to one among the various red cars of which he is aware which is in some way closer than the others to the focus of his attention. [Footnote 17: If Russell's theory of descriptions (Russell [1905] 1949: 105; Whitehead &amp; Russell 1927: 30) were an accurate semantic description of English (which Russell did not, of course, claim it to be -- cf. his [1957] 1969: 335-7), then most English sentences uttered in practice would be simply false, because they contain IEs asserting the uniqueness of objects fitting descriptions which in fact are multiply instantiated. Philosophers who have discussed reference have treated it as a simple relation between expressions and things, rather than as the composite relation for which I argue; but they have succeeded in this only by devoting undue attention to NPs, such as Socrates, which perhaps have only one possible denotatum -- NPs which seem to be quite rare in practice.]
This will translate into our theory as the notion that the referents in a topicon are arrayed in some kind of space, one point of which constitutes the focus of attention at any given time. The nature of this space, and the factors which determine the position of the referents and focus of attention in it, will be considered in §16 below; for the moment, let us simply assume that the notion can be made precise.</Paragraph>
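The composition of reference and representation described above can be sketched in a few lines of Python. All names and data here are invented for illustration; the point is only that denotation is the relational composite, and that it may be undefined even where reference succeeds:

```python
# Illustrative sketch only: the relation names and data are invented to
# mirror the terminology above (reference, representation, denotation).

reference = {"your car": "r1"}   # IE -> referent in P's topicon
representation = {"r1": "C"}     # referent -> thing in the outside world

def denotatum(ie):
    """Denotation as the composition of representation with reference."""
    referent = reference.get(ie)
    if referent is None:
        return None
    return representation.get(referent)

# An IE may refer without denoting: a fictional referent represents nothing.
reference["Raskolnikov"] = "r2"

print(denotatum("your car"))     # C
print(denotatum("Raskolnikov"))  # None
```

A fictional name thus has a referent but no denotatum, exactly the asymmetry the terminology is designed to capture.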
    <Paragraph position="48"> Then we can say that any IE consisting of the word the followed by a series x1, x2, ..., xn of adjectives and nouns will refer to the nearest referent to the hearer's focus of attention having all the properties §(x1), §(x2), ..., and §(xn). Thus, the car will refer to the nearest §(car) referent to the focus, while the red car will refer to the nearest referent to the hearer's focus which is both §(red) and §(car). One would expect that the nearest referent of all to the focus in any topicon-state should be referred to simply as the in English; a syntactic rule replaces the as a complete NP by he, she, or</Paragraph>
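The resolution rule just stated can be sketched as follows. The referents, properties, and distances are invented, and distance-to-focus is simply stipulated here rather than derived from the topicon space of §16:

```python
# A minimal sketch (data invented): "the x1 ... xn" resolves to the referent
# nearest the hearer's focus of attention bearing all of §(x1) ... §(xn).

referents = {
    "r1": {"props": {"car", "red"}, "dist": 3},  # dist: edges from the focus
    "r2": {"props": {"car"}, "dist": 1},
    "r3": {"props": {"red", "hat"}, "dist": 2},
}

def resolve_definite(*words):
    """Return the nearest referent having every property named in words."""
    candidates = [(info["dist"], name) for name, info in referents.items()
                  if info["props"].issuperset(words)]
    if not candidates:
        return None
    return min(candidates)[1]

print(resolve_definite("car"))         # r2: nearest §(car) referent
print(resolve_definite("red", "car"))  # r1: nearest referent that is both
```

Adding an adjective narrows the candidate set, so the red car and the car may pick out different referents, as in the text.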
    <Paragraph position="50"> it. 11. In APL, objects can be referred to by their identifiers.</Paragraph>
    <Paragraph position="51"> The obvious candidates as natural-language equivalents of identifiers are proper names. However, although some logicians have discussed proper names, under the label singular terms, as if [Footnote 18: I can offer no explanation of the syntactic distinction between nouns and adjectives, which serves no obvious semantic function; however, since the distinction appears to be universal in natural languages, my account of English semantic representations will incorporate it.]</Paragraph>
    <Paragraph position="52"> [Footnote 18, continued: One solution to this puzzle may have to do with the fact that some adjectives are 'syncategorematic' in a way which nouns never are: a 'good actor' is not necessarily good though he is necessarily an actor.] they are the equivalent of APL identifiers (for a summary of the alternative views, see Cheng 1968), English proper names in fact do not behave in this way. In APL, a state in which two distinct objects bear the same identifier is simply not a well-formed state. In English, on the other hand, locutions</Paragraph>
    <Paragraph position="54"> occur frequently enough: although many proper nouns apply only to one referent in an average topicon, many apply to more than one. Superficially, proper nouns seem syntactically distinct from common nouns in that IEs containing proper nouns lack the: the car, but not *the London. However, Sloat (1969) has argued convincingly that in deep syntactic structure proper nouns are preceded by the, and that proper and common nouns are syntactically quite parallel in the base. We shall take it that proper nouns correspond to properties for referents in just the same way as common nouns: London refers to the nearest §(London) referent to the hearer's focus, as the car refers to the nearest §(car) referent. (The problems of how the pairs of IEs in (8) and (9) succeed in referring to distinct referents will be answered in §12 and §15 respectively.) Clearly there is a distinction between names and common nouns in that the applicability of a name to an object is more 'arbitrary' than that of a common noun. But this distinction is gradient rather than all-or-none; e.g. a schoolboy's nickname, such as Fatty, will be intermediate in arbitrariness (a boy called Fatty will probably be fat, but not all fat boys will be called Fatty). 12. A somewhat more complicated situation arises in connection with IEs involving genitive constructions. The 'basic' sense of the genitive is commonly taken to be possession, as in John's car; however, the genitive often represents other</Paragraph>
    <Paragraph position="56"> relationships: the ... of the problem, the ... of the ..., etc. Even in a case where the genitive NP denotes a person and the head NP denotes an inanimate object, such as John's car, although on many occasions of use the NP will be paraphrasable as the car which John owns, the same NP will surely be used equally frequently in other situations in which the appropriate paraphrases would be the car which we saw knock John down, the car which John knows he'd like to buy if he only had the money, or other expressions of purely idiosyncratic and ephemeral relationships between the denotatum of the genitive NP and that of the head NP. The device of the topicon space permits a neat account of this situation.</Paragraph>
    <Paragraph position="57"> In an NP of the form A's B or the B of A (e.g. John's car, the roof of the house), A will as usual pick out the nearest referent to the hearer's focus having the properties corresponding to the lexical items of A, while the NP as a whole will pick out the nearest referent to that referent having the properties corresponding to the lexical items of B. Thus in the case of John's car, John will pick out the nearest refer-</Paragraph>
    <Paragraph position="59"> ent to the hearer's focus having the property §(John) and, if that referent is r1, John's car will pick out the nearest referent to r1 having the property §(car).</Paragraph>
    <Paragraph position="60"> The latter referent need not be the nearest §(car) referent to the hearer's focus; if it is not, the car and John's car will have different reference for him. As we shall see when we discuss the organization of the topicon-space in §16, ownership is only one of the factors that may cause a §(man) referent to be close to a particular §(car) referent in a topicon.</Paragraph>
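The two-step lookup for genitives can be sketched in the same style. All referent names and distances below are invented; pairwise distances are stipulated rather than computed from a graph:

```python
# Hedged sketch (all names and distances invented): "John's car" picks the
# nearest §(John) referent to the focus, then the nearest §(car) referent
# to THAT referent, which need not be the nearest §(car) referent overall.

props = {"r1": {"John"}, "r2": {"car"}, "r3": {"car"}}
dist_from_focus = {"r1": 1, "r2": 2, "r3": 5}
dist_between = {("r1", "r2"): 4, ("r1", "r3"): 1}

def nearest(prop, distances):
    """Nearest referent bearing prop, under the given distance assignment."""
    candidates = [(d, r) for r, d in distances.items() if prop in props[r]]
    if not candidates:
        return None
    return min(candidates)[1]

the_car = nearest("car", dist_from_focus)  # r2: nearest to the focus
john = nearest("John", dist_from_focus)    # r1
johns_car = nearest("car", {r: d for (a, r), d in dist_between.items()
                            if a == john})  # r3: nearest to r1

print(the_car, johns_car)  # r2 r3
```

With these distances the car and John's car resolve to different referents for the hearer, the situation described in the text.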
    <Paragraph position="61"> 13. Certain English words, known as deictics or token-reflexives, correspond to the terms labelled deic in APL: these include I, you, now, here, etc.</Paragraph>
    <Paragraph position="62"> Deictics, like other IEs, pick out referents of the hearer's topicon; but their referents depend on characteristics of the speech act in which they are used, and are independent of the arrangement or properties of referents in the hearer's topicon. For this reason, deictics never occur as the head of a genitive construction, and there are no phrases like *John's you (with 's</Paragraph>
    <Paragraph position="64"> as genitive rather than short for is); you will refer to the same referent on a given occasion (namely the referent representing the addressee of the speech act -- the owner of the topicon, unless he is overhearing words addressed to someone else) whatever other referents are in the vicinity, so it would be otiose to modify a deictic with a genitive NP.</Paragraph>
    <Paragraph position="65"> 14. So far I have discussed only referents corresponding to noun-phrases in syntax, and representing individuals in the outside world. However, some referents will represent what would more normally be called 'facts' or 'events' than 'individuals'. Ordinary predicate logic distinguishes sharply between individuals on the one hand, and facts or events on the other: the former are translated into singular terms, the latter into [Footnote 19: Linguists do not usually include the first and second person pronouns among the 'deictics', but logically they are of the same category.]</Paragraph>
    <Paragraph position="66"> arrays of predicate followed by arguments, and the syntax of the predicate calculus does not permit one to occur in place of the other. However, in English, if e.g. John bought the car has the semantic representation 'f(a, b)' (where f is the predicate bought and a and b are singular terms standing for John and the car), then presumably the semantic representation of: (10) It surprised Mary that John bought the car.</Paragraph>
    <Paragraph position="67"> will have to have 'f(a, b)' as one of the arguments of the predicate surprise -- (10) will have to be represented as some-</Paragraph>
    <Paragraph position="69"> thing like 'g(f(a, b), c)', where g is surprise and c stands for Mary (tense is discussed later in this section). [Footnote 20: Rosenbaum (1967) has shown that in deep syntax, before the application of a transformation called 'Extraposition', (10) has the normal subject-verb-object structure with that John bought the car as subject. The need to permit propositions as arguments of predicates is discussed by Leech (1969: 25-6). In the 'semantic representations' given here, I arrange the predicate-symbol to the left of all the arguments, in order to clarify the comparison with standard logical notation. It is by a quite arbitrary choice, however, that formal logic writes 'f(a, b)' rather than 'a f b', and when I define the automaton for English below I shall adopt the ordering which more closely reflects the surface structure of English. (It is a moot point within linguistics whether the deep structures of English sentences have the ordering subject-verb-object or verb-subject-object.)] If facts, as well as things, may be denoted by suitable linguistic expressions, then suppose that the topicon contains referents representing facts (propositional referents) as well as referents representing things (individual referents). We will suppose further that the referents in a topicon are linked in a graph structure in which propositional referents dominate n-tuples of (propositional or individual) referents, corresponding to the arguments of the respective propositions. Consider e.g. one who knows that someone called John bought a car: his topicon will contain a structure of the following form: In (11) nodes stand for referents, which I shall call 'r'</Paragraph>
    <Paragraph position="71"> nodes (of topicon-states, as opposed to 'd' for nodes of phrase-markers of sentences). The lowest-level referents are unlabelled, while the higher-level referents are each labelled with an English word.</Paragraph>
    <Paragraph position="72"> The referents r1 and r2 represent respectively John and the car, while r4 represents the fact that John bought the car. The referent r5 represents the fact that the thing represented by r2 is a car, while r3 represents the fact that the thing represented by r1 is a John ('is called "John"', as we usually say in the case of proper names).</Paragraph>
    <Paragraph position="73"> To say that a referent, say r2, has the property §(car) is to say that there is some referent, in this case r5, which dominates the 1-tuple (r2) and which is labelled car.</Paragraph>
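The dominance structure just described can be rendered as a small sketch. The referent numbering is assumed from the surrounding discussion and is otherwise arbitrary:

```python
# Sketch of the graph structure of (11): propositional referents dominate
# tuples of other referents; "r2 has the property §(car)" means a referent
# labelled car dominates the 1-tuple (r2,). Referent names are assumptions.

topicon = {                       # referent -> (label, dominated tuple)
    "r1": (None, ()),             # individual referent: John
    "r2": (None, ()),             # individual referent: the car
    "r3": ("John", ("r1",)),      # fact: r1 is (called) John
    "r4": ("buy", ("r1", "r2")),  # fact: John bought the car
    "r5": ("car", ("r2",)),       # fact: r2 is a car
}

def has_property(referent, label):
    """True iff some referent with this label dominates (referent,)."""
    return any(lab == label and dom == (referent,)
               for lab, dom in topicon.values())

print(has_property("r2", "car"))  # True
print(has_property("r1", "car"))  # False
```

Property-possession is thus not a primitive but is read off the graph, which is what allows sentences to create properties simply by adding dominating nodes.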
    <Paragraph position="74"> A sentence acts as an EE for the establishing of propositional referents, as an indefinite NP such as a car is an EE for establishing individual referents. Thus, suppose P's topicon contains the structure of (11), together with a §(Mary) referent, say r6 (that is, r6 is dominated by a propositional referent r7 labelled Mary): then P's hearing (or reading) sentence (10) will add to his</Paragraph>
    <Paragraph position="76"> topicon a new propositional referent, say r8, labelled surprise and dominating the 2-tuple (r4, r6): P's new topicon-state will contain the structure shown as (12) on the next page.</Paragraph>
    <Paragraph position="77"> In (12), broken lines show the new structure created by sentence (10). Notice that propositional referents, like individual referents, may be referred to by pronouns; if r4 is close enough to P's focus of attention, the same effect will be achieved by the sentence: (13) It surprised Mary.</Paragraph>
    <Paragraph position="78"> The number of referents dominated by a given referent in a topicon will correlate with the label of the latter referent. An unlabelled referent will be an individual referent and will dominate nothing; a referent labelled with an n-adic predicate will dominate an n-tuple of referents. Thus a referent labelled with a noun will dominate one referent; a referent labelled with a verb taking subject, direct object, and indirect object will dominate a 3-tuple of referents; and so on. In natural languages the distinctions between the different arguments of a verb are shown sometimes by ordering, sometimes by prepositions (to John) or case endings (Johanni), etc.</Paragraph>
    <Paragraph position="79"> I assume that some individual referents represent points of time, and that one of the arguments of most verbs in natural languages is the time at which the action in question occurred. Where a verb in the preterite occurs with no phrase overtly denoting a point of time, I take it that the nearest time referent to the hearer's current focus of attention becomes the respective argument of the new propositional referent: one would not normally say e.g. John bought the car unless the hearer can be expected to know what occasion one is speaking about. In other words, preterite tense picks out the nearest §(time) referent as he picks out the nearest §(male) referent. McCawley (1971) has argued that preterite tense and pronouns have a common syntactic origin, a finding which renders my account the more plausible.</Paragraph>
    <Paragraph position="81"> Although I assume time arguments for verbs, to avoid clutter I shall not include them on diagrams.</Paragraph>
    <Paragraph position="82"> That-clauses may be used to refer to propositional referents either as IEs or as EEs, without the distinction being marked syntactically. Sentence (10) (It surprised Mary that John bought the car) is equally appropriate whether or not the hearer already knows that John bought the car. Thus, if P's topicon contains the structure of (11) (p. 35), the phrase that John bought the car in (10) will pick out r4 and create the extra structure of (12); but if P's topicon lacks r4, then the same phrase will create a referent labelled buy and dominating r1 and r2 (and a time referent) before the rest of the [Footnote 21 begins: A verb in the perfect, as in John has bought the car,]</Paragraph>
    <Paragraph position="84"> [Footnote 21, continued: will act as an EE for a time referent, as a verb in the preterite acts as an IE. These remarks might, however, have to be modified to handle American usage: one of the characteristics of American English is that it permits the preterite in circumstances where the perfect would be obligatory in British usage.] sentence creates a node labelled surprise dominating this new node, r6, and a time referent. The absence of syntactic distinction between phrases establishing propositional referents and phrases identifying them can readily be explained. Either John bought the car at the time in question or he did not; there will never be two referents both labelled buy and dominating the same 3-tuple of individual referents, so if P's topicon contains (11) and he hears the phrase that John bought the car then he knows that this must refer to r4 rather than calling for the creation of a new propositional referent. If there were no distinction between the car and a car, on the other hand, he would have no way of knowing, on hearing the IE the car, whether r2 or some new §(car) referent was intended. I take it that languages lacking definite and indefinite articles mark the IE/EE distinction for individual referents by some other syntactic devices.</Paragraph>
    <Paragraph position="85"> 15.</Paragraph>
    <Paragraph position="86"> The graph structure into which an individual referent enters can be used to pick out that referent by means of relative clauses. Thus if the car refers to r2, then the IE: [Footnote 22: It is convenient to speak of a topicon's owner as 'knowing' facts about his topicon, just as it is convenient to anthropomorphize a computer program and speak of it 'knowing' various facts; these locutions are, of course, literally nonsense, but they could easily be replaced by longer paraphrases which did not commit category mistakes.]</Paragraph>
    <Paragraph position="87"> (14) the man who bought the car will refer to r1 just in case a referent labelled buy dominates a 3-tuple including r1, r2, and the nearest time referent.</Paragraph>
    <Paragraph position="88"> If P knows that the denotatum of r1 is a man (i.e. if his topicon includes, in addition to the structure diagrammed in (12) (p. 37), a referent labelled man and dominating r1), and if there are no tense problems, then (14), the man who bought the car, will refer to r1. However, note the distinction between restrictive and appositive relative clauses (cf. Bach 1968). A restrictive relative clause, e.g. who bought the car in (14), is part of an IE: it gives a property of the target referent. An appositive relative clause, on the other hand, as in (15), acts as an EE: (15) The man, who bought the car, is old.</Paragraph>
    <Paragraph position="89"> In (15), the man acts as a complete IE; when (15) is input to a topicon, the man will pick out the nearest §(man) referent to the focus (say r1), and then the appositive relative will create a new referent labelled buy and dominating r1 and the referent of the car, before the main clause creates a referent labelled old dominating r1. The function of appositive relative clauses in natural languages is thus quite comparable to that of embedded assignment clauses in APL. 16. The principle that each sentence received by a hearer creates a new referent in the hearer's topicon suggests a natural way of reconstructing within the theory the notion of a 'focus of attention', which varies with the topics being discussed: we may define the focus of attention as the most recently-created referent at any given time. The graph structure associated with propositional referents offers a way of formalizing the notion of distance between referents in the topicon: we may define the distance between any two referents as the minimum number of edges (i.e. lines which link nodes) that must be traversed to get from one referent to the other. Thus, consider the sequence of sentences: (i) John bought a car.</Paragraph>
    <Paragraph position="91"> Suppose the hearer's topicon already contains a referent, say r1, with the properties §(John) and §(man). After hearing sentences (i) and (ii) but before (iii), the hearer's topicon will include the structure of (16), with the focus at r5 (the referent created by (ii)): (16) contains two referents to which he could refer, namely r1 and r4; r4 is one edge from the focus and r1 is three edges away.</Paragraph>
    <Paragraph position="92"> Therefore the theory predicts that he in (iii) will be taken to refer to r4 rather than r1, and this prediction seems correct: he in (iii) will be taken to denote the man who has been hit, rather than John. (Notice that this cannot be predicted from the situation described: when a driver hits a pedestrian, the driver is as likely as the pedestrian to call the police.) 17. We may now define the automaton which I claim to represent the mind of a speaker of English. The grammar of the subset of English we are analysing is as follows:</Paragraph>
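The distance metric and the resulting pronoun-resolution prediction can be checked mechanically. The adjacency list below is an invented stand-in for structure (16), chosen to give the edge-counts stated in the text:

```python
from collections import deque

# Sketch (adjacency invented to mirror structure (16)): distance is the
# minimum number of edges between referents, and "he" resolves to whichever
# candidate referent lies nearest the focus of attention.

edges = {
    "r5": ["r2", "r4"],  # r5: the hit-fact created by (ii), the focus
    "r2": ["r5", "r3"],  # r2: the car
    "r3": ["r2", "r1"],  # r3: the buy-fact created by (i)
    "r1": ["r3"],        # r1: John
    "r4": ["r5"],        # r4: the man who was hit
}

def distance(a, b):
    """Breadth-first search for the minimum edge count from a to b."""
    seen, frontier = {a}, deque([(a, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == b:
            return d
        for neighbour in edges.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, d + 1))
    return None

def resolve_pronoun(focus, candidates):
    return min(candidates, key=lambda r: distance(focus, r))

print(distance("r5", "r4"), distance("r5", "r1"))  # 1 3
print(resolve_pronoun("r5", ["r1", "r4"]))         # r4
```

With the focus at the hit-fact, the man who was hit is one edge away and John is three, so he resolves to the former, matching the prediction in the text.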
    <Paragraph position="94"> The finite set of predicates of English, together with the phonetic shapes of particles such as the and of, will be spec-</Paragraph>
  </Section>
  <Section position="4" start_page="90" end_page="90" type="metho">
    <SectionTitle>
- AAP
</SectionTitle>
    <Paragraph position="0"> ific to the English language. I would hypothesize that in other respects (17) generates the semantic representations of sentences in any natural language, though the rules which relate the phrase-markers generated by (17) to the corresponding surface forms will vary from language to language.</Paragraph>
    <Paragraph position="1"> Some of the latter rules which operate in English will be obvious. Thus, subordinate clauses (non-root 'S' constituents) have that prefixed to them (replacing the in case the latter appears); nouns not preceded by the are supplied with a/an; 'the NP of IE' may become 'IE's NP' in some cases; wh is real-</Paragraph>
    <Paragraph position="3"> ized as who or which, or as he or she in certain circumstances; adjectives have the verb be supplied, or are moved in front of their noun with the relative pronoun wh deleted; clauses outside an IE are given commas to mark them as appositive rather than restrictive relative clauses; etc. I shall not attempt to render explicit every detail of the relation between my semantic representations and superficial structures of English sentences.</Paragraph>
    <Paragraph position="4"> We may define the set S of states of the automaton</Paragraph>
    <Paragraph position="6"> as follows. Suppose Pred is the finite set of English predicates, i.e. the set Pr1 ∪ Pr2 ∪ ... in (17). Then a pair</Paragraph>
    <Paragraph position="8"> [Footnote 23: I treat the distinction between he and she as determined rather than as needing to be marked in the semantic representation: in the standard use of English pronouns (leaving out of account the special rules operating under contrastive stress), he is appropriate only if the intended referent is the nearest individual referent of all to the hearer's focus, not merely the nearest of the §(male) referents.]</Paragraph>
    <Paragraph position="9"> [Footnote 24: I use semiforest as a generalization of the notion tree.] Each node immediately dominating a length-n string is labelled with an n-adic predicate and each leaf is unlabelled. The function Int_Eng, which determines how a sentence of English moves a topicon from one state to another, is specified by rules which associate subsets of the referents of the current topicon state with nodes in the structural description of the sentence; as in the APL case, certain nodes cause additions to the current state. We shall write 'Ref' for the partial function, specified by these rules, from nodes of the sentence into subsets of the topicon-referents; in the case where a constituent refers (in our technical sense) to a referent, Ref will take the node dominating the constituent into the unit set containing that referent.</Paragraph>
    <Paragraph position="10"> The rules determining Int_Eng are as follows (see next page). [Footnote 24, continued: A semiforest is allowed to have more than one root, and nodes are allowed to branch upwards as well as downwards. A semiforest over a vocabulary V is a triple (D, δ, α) where D is a set of nodes, δ is a partial function of immediate dominance from D into strings over D such that every node is dominated (not necessarily immediately -- dominate is the ancestral of 'immediately dominate') by at least one root (i.e. undominated node), and α is a partial function of labelling from D into V. Nodes outside the domain of δ are leaves or terminal nodes. Note that, by defining the range of δ as containing strings, I have built left-to-right ordering into my definition; semiforests as defined here are 'string-semiforests' rather than 'set-semiforests' in the sense of Sampson (forthcoming).]</Paragraph>
    <Paragraph position="11"> (R1) Whenever two nodes d, d' of the phrase-marker are such that d immediately dominates the length-1 string d', if Ref(d') is defined then Ref(d) = Ref(d'). [Footnote 25: Strictly, R2 should read: 'If a node d labelled IE immediately dominates the length-2 string d' d'' in which d' is labelled the ...' (etc.); the abbreviated wording used here should be self-explanatory. (Cf. also the prime on 'IE'' in rule R3, used to distinguish two nodes each labelled IE.)] (i) if NPn is the left sister of S and some NPi (1 ≤ i &lt; n) is realized as wh, then Ref(NPi) = Ref(NPn); (ii) if there is a referent r labelled P and immediately dominating r1 ... rn, where {r1} = Ref(NP1), {r2} = Ref(NP2), ..., and {rn} = Ref(NPn), then Ref(S) = {r}; (iii) if no such referent as in (ii) exists, then it is created, and Ref(S) = {r}.</Paragraph>
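Clauses (ii) and (iii) amount to a find-or-create operation on propositional referents: a clause acts as an IE when a matching referent exists and as an EE otherwise. A hedged sketch, with invented referent names:

```python
# Sketch of clauses (ii) and (iii) above (names are assumptions): pick out
# an existing propositional referent with the right label and argument
# tuple, or create a fresh one when none exists.

topicon = {"r4": ("buy", ("r1", "r2"))}  # referent -> (label, argument tuple)
counter = [8]                            # source of fresh referent names

def ref_S(label, args):
    for name, (lab, dom) in topicon.items():
        if lab == label and dom == args:
            return {name}                # clause (ii): pick out (IE)
    counter[0] += 1
    new = "r%d" % counter[0]
    topicon[new] = (label, args)         # clause (iii): create (EE)
    return {new}

print(ref_S("buy", ("r1", "r2")))   # {'r4'}: the referent already exists
print(ref_S("know", ("r5", "r4")))  # {'r9'}: a new referent is created
```

Because there can never be two referents with the same label dominating the same tuple, the operation is deterministic, which is why the IE/EE ambiguity of that-clauses is harmless.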
    <Paragraph position="12"> (18) The man who caught John's fish, who was bald, knew that you love the teacher who bought a horse.</Paragraph>
    <Paragraph position="13"> Sentence (18) has the following semantic representation (as usual I ignore tense for simplicity). I omit the superscripts from 'Pr' nodes, since they are obvious, and I subscript certain nodes for later discussion.</Paragraph>
    <Paragraph position="14"> The topicon state to which (19) is input is assumed to be as in (20) below (without the material drawn in dotted lines), with the current focus indicated by concentric circles: The owner of the topicon diagrammed in (20), to whom (19) is addressed, is represented in (20) by r5: he is a man called Dick who has caught and eaten a fish, and who loves the denotatum of r7, who is a woman teacher who has bought a horse. The denotatum of r2 is a man called John who has also bought a horse and has eaten a fish which was caught by the denotatum of r11, a man teacher called Tom, who loves the same woman as Dick.</Paragraph>
    <Paragraph position="15"> We now use rules R1-R10 to interpret the nodes of (19), beginning with the leftmost interpretable leaf (since the material on the left of (19) is what is heard first). Noun1 is dominated by IE, so by R9 Ref(Noun1) = {r1, r2, r5}; hence by R1 Ref(NP2) is also {r1, r2, r5}. Similarly Ref(Noun3) is {r2}, so, trivially, by R2 Ref(IE4) is {r2}. Ref(NP5) is</Paragraph>
    <Paragraph position="17"> eight edges from r2, Ref(IE6) is {r3} (John's fish denotes the fish that John ate, on this occasion). By R7, Ref(S7) is the unit set of the referent labelled catch whose second member is r3 (only Tom caught the denotatum of r3); so, by R8, Ref(NP8)</Paragraph>
    <Paragraph position="18"> senting the addressee, so Ref(NP13) = {r5}. Ref(NP14) = {r7, r11}; Ref(NP15) = {r7, r11}; both r8 and r9 dominate pairs of referents the second member of which belongs to Ref(NP15), so,</Paragraph>
    <Paragraph position="20"> and the new focus is at the newly created referent. Notice that, were it not for the appositive clause who was bald, the phrase that you love the teacher in (18) would be redundant. The initial focus was at the love referent (the previous sentence had been You love the teacher, say); the sentence The man who caught John's fish knew it would serve as well as The man who caught John's fish knew that you love the teacher or ... that you love her to create the referent. However, the appositive clause who was bald shifts the focus, so that a following it would be taken to denote the fact of his baldness rather than that of Dick's loving the teacher. Intuitively this prediction seems correct. The relation Suc_Eng, which determines which possible next states the automaton can move to from any given state independently of input, will correspond to the rules of inference in the semantic description of English. Thus, suppose there is a rule 'x fish &amp; y catch x → x die' in English (i.e. suppose it is part of the meaning of the words fish, die, and catch that a fish dies if it is caught); then the topicon of (20) will be liable at any time to acquire a referent labelled die and dominating either of the two §(fish) referents, since each of these has the property §(fish) and occurs as second argument of a referent labelled catch. Clearly, the automaton will be non-deterministic: the single rule of inference mentioned permits two alternative successor states for (20). Anyone with experience of constructing deductions in formal logic knows that there are typically a large (though finite) number of ways of continuing a given derivation; similarly, the rules of inference for a natural language will no doubt permit many possible successor-states for any given state.</Paragraph>
    <Paragraph position="21"> If the process of moving through states under the control of the successor-state relation is to be the reconstruction, within the topicon theory, of the pretheoretical notion of thinking, this characteristic seems desirable: we do not feel that human thought flows along deterministic channels.</Paragraph>
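The non-determinism of the successor-state relation can be illustrated with the fish/catch/die rule. The facts below are an invented fragment standing in for (20):

```python
# Hedged sketch (facts invented): one inference rule, "x fish and y catch x
# implies x die", applied non-deterministically: each caught fish yields a
# distinct possible successor state.

state = [
    ("fish", ("r3",)),
    ("fish", ("r6",)),
    ("catch", ("r11", "r3")),
    ("catch", ("r5", "r6")),
]

def successors(facts):
    """All states reachable by one application of the fish/catch/die rule."""
    out = []
    for label, args in facts:
        if label == "catch":
            caught = args[1]
            if ("fish", (caught,)) in facts and ("die", (caught,)) not in facts:
                out.append(facts + [("die", (caught,))])
    return out

nexts = successors(state)
print(len(nexts))  # 2: two alternative successor states
```

A single rule already yields two possible continuations, and each successor state in turn admits further inference steps, mirroring the open-ended branching of derivations in formal logic.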
    <Paragraph position="22"> 18. Although the effects of most changes of state in the cases of the machine-language discussed in §2 and of APL were confined to the automata themselves, in both cases certain state-changes were associated with action by the automaton on its environment. Thus, whenever an APL-state acquired an object named ⎕, a representation of the property of that object was printed by the system on an output sheet of paper. We may imagine that action is linked to thought in this way also in the human case. Suppose some referent in a topicon represents the person who owns that topicon; then it might be that whenever, during a sequence of state-changes controlled by the successor-state relation, the topicon acquires a referent labelled assert and dominating that referent in subject position and some propositional referent r1 in object position, the owner of the topicon utters a sentence which asserts the proposition represented by r1. And, supposing r2 represents some person, say John, if a hit referent is created dominating (the owner's referent, r2), then the topicon-owner hits John. [Footnote 26: I do not intend this paragraph to imply any position on the determinism/free-will issue. If determinism is correct, then there will presumably be laws deciding which out of the various successor states permitted by the rules of inference of its language a given topicon actually moves into at a given time. Such laws lie outside the scope of this article.]</Paragraph>
    <Paragraph position="23"> 19. There are two obvious problems connected with the notion that the referents in a topicon, which are supposed to correspond to the entities of which the topicon-owner is aware and the propositions he believes, are created by input sentences. The first problem is that no allowance is made for the possibility that speakers are not believed. Thus, if the topicon-owner hears John, the denotatum of r2, say I bought a car yesterday, then according to the rules I have laid down his topicon acquires a §(car) referent representing John's new car. But in practice, obviously the topicon-owner may choose to disbelieve John; what happens to his topicon in this case? The second problem is that it is simply untrue that a person acquires beliefs about the existence of entities and the truth of propositions only by being told about them. I may come to believe that there exists a red car either because John tells me that he has bought a red car and I believe him, or because I see the red car; similarly, I may come to believe that John bought the red car either because he tells me so or because I watched the transaction take place. The car may subsequently be denoted by the phrase the red car, and the proposition about it by the clause that John bought the red car, irrespective of whether the referents representing the car and the proposition were created in response to speech or observation. [Footnote 27: These remarks may sound as if I am treating humans as mindless robots -- 'automata' in the pejorative, deterministic sense -- but quite the reverse: remember that the referents whose creation correlates with the topicon-owner's actions are brought into being by the process we have identified with thinking. There is nothing disrespectful to our species in suggesting that our actions are controlled by our thought.] The answers to these problems are related.</Paragraph>
    <Paragraph position="24"> I suggest that the sight of John buying a car is the kind of input to a person that has the effect on his topicon which I have so far attributed to the hearing of the sentence John bought a car (or John bought the car if the car is one of which the topicon-owner is already aware): in other words, this sight creates a referent labelled buy and dominating referents representing John and the car. On the other hand, hearing, say, Mary uttering the words John bought a car, or hearing John say I bought a car, has a more complex effect than I have been suggesting: it creates a node labelled assert dominating the referent representing Mary (or John) together with a new buy referent, as already mentioned.</Paragraph>
    <Paragraph position="25"> I diagram the two cases in (21) and (22), on the next page. The part of the diagram in solid lines is the same in each case, and represents part of the hearer's topicon before the change of state. In (21) the dotted lines represent the effect on the topicon of seeing John buy a car; in this case, since the topicon-owner sees the car, we may assume that he adds some further facts about it (such as that it is red) to his topicon. In (22) the dotted lines show the result instead of hearing John say I bought a car. In this case, the referent represent-</Paragraph>
  </Section>
  <Section position="5" start_page="90" end_page="90" type="metho">
    <SectionTitle>
- NW
</SectionTitle>
    <Paragraph position="0"> ing the car will be dominated just by the car node and the buy node, since the hearer has no independent information about it.28
    28. Ross (1970) and others have claimed that there is actually syntactic evidence that John bought a car has a deep structure something like I assert to you that John bought a car. Ross's arguments are attacked by Fraser (1970) and Matthews (1972). My theory is intended to be independent of Ross's claim, although the latter, if accepted, would possibly make my theory seem more plausible. Notice that, if John tells you he has bought a car, you may well doubt that he has bought a car, but you are not free in the same way to doubt that John has asserted that he has bought a car. You may, of course, doubt the latter also -- 'Did he really say the words I thought I heard him say?', 'Can I be sure it was really John speaking?' -- but this is to doubt the accuracy of one's observations, as one may doubt whether John bought a car after watching him buy it, rather than doubting the truth of what is said to one.</Paragraph>
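The contrast between (21) and (22) can be sketched in code. This is a hypothetical encoding of my own (the dict representation and the names see_event and hear_report are assumptions, not the paper's notation): direct observation deposits a bare buy referent, while reported speech wraps the same proposition under an assert referent dominating the speaker.

```python
# Sketch of the contrast diagrammed in (21) and (22): observation adds a
# bare 'buy' referent; hearing a report adds a 'buy' referent under an
# 'assert' referent. Encoding assumed for illustration only.

def see_event(topicon, verb, subject, obj):
    """Direct observation: create the propositional referent itself."""
    ref = {'label': verb, 'dominates': [subject, obj]}
    topicon.append(ref)
    return ref

def hear_report(topicon, speaker, verb, subject, obj):
    """Hearing speech: create the proposition, then an assert node over it."""
    prop = {'label': verb, 'dominates': [subject, obj]}
    topicon.append(prop)
    report = {'label': 'assert', 'dominates': [speaker, prop]}
    topicon.append(report)
    return report

hearer = []  # the hearer's topicon, as a flat list of referents
john, car = {'label': 'SS(John)'}, {'label': 'SS(car)'}
see_event(hearer, 'buy', john, car)           # case (21): seeing the event
hear_report(hearer, john, 'buy', john, car)   # case (22): John reports it
top_labels = [r['label'] for r in hearer]
```

In case (22) the hearer can doubt the embedded buy referent while the assert referent stands, which matches the observation in footnote 28 above.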
    <Paragraph position="1"> Clearly there are enormous problems about how observations via the senses of a complex and continuous environment result in topicon changes corresponding to the input of a discrete sentence: why should my view of John handing over a cheque on the car-dealer's forecourt change my topicon in the way which corresponds to the sentence John bought a car, rather than any of the (surely) infinitely many other propositions which could be corroborated on the evidence of my current visual, auditory, etc. inputs? However, these problems are in no sense created by the topicon theory: these are already familiar problems in psychology and in the philosophy of science. (Cf. e.g. Hanson 1958, Gregory 1970.) Some process of deriving discrete propositional beliefs from continuous sensory input must occur, if observation is to be relevant to propositional knowledge at all. Since this process is known to exist independently of my theory, and since I can make no contribution to understanding it, I shall not consider it further.</Paragraph>
    <Paragraph position="2"> Once we agree to treat simple declarative sentences as creating propositional referents labelled assert, there is no special difficulty in handling sentences performing other illocutionary acts; e.g. Shut the door! will establish a command referent dominating referents representing speaker, hearer, and the proposition that the hearer will shut the door.</Paragraph>
    <Paragraph position="3"> Rules of inference may permit referents representing facts about the world to be created on the basis of referents representing facts about assertions. Suppose, for instance, that there is a rule of inference which we might state as 'x asserts p, x truthful → p true'; then a topicon including referents representing the fact that John is truthful and the fact that John asserted that he bought a car will be able to move to a state in which the representation of the proposition asserted by John is labelled true, as shown in (23). Similarly, one can imagine that there might be rules of inference taking a topicon from the state created by the reception of Shut the door! to a state which causes the topicon-owner to shut the door. However, here we come close to the point at which my theory in its present state breaks down; I defer discussing this.
    20. According to the theory I have sketched, English as a programming language is not dissimilar to APL, SNOBOL, etc. It resembles the latter in that its states consist of arrays of objects drawn from a specified class (although the precise structure of the arrays is different as between English and the artificial programming languages, as it is between the latter themselves), and in that the structural descriptions of its sentences include a subclass of nodes which pick out objects from the current state and another subclass which add new objects to the current state. English differs from APL, SNOBOL, etc., in lacking identifiers, and in using the property of distance between objects in a state in order to identify objects.</Paragraph>
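The rule 'x asserts p, x truthful → p true' can be given a toy forward-chaining reading. The flat-dictionary state encoding and the name successor_state are my own assumptions; the paper specifies no concrete machinery for its rules of inference.

```python
# Toy sketch of one step of the successor-state relation applying the rule
# 'x asserts p, x truthful -> p true'. Encoding assumed, not the paper's.

def successor_state(state):
    """Return a successor state: add the 'true' marks the rule licenses."""
    truthful = {r['arg'] for r in state if r['label'] == 'truthful'}
    additions = [{'label': 'true', 'arg': r['prop']}
                 for r in state
                 if r['label'] == 'assert' and r['arg'] in truthful]
    # Only add marks not already present, so repeated steps are idempotent.
    return state + [a for a in additions if a not in state]

state = [
    {'label': 'truthful', 'arg': 'John'},
    {'label': 'assert', 'arg': 'John', 'prop': 'buy(John, car1)'},
]
state2 = successor_state(state)
```

A topicon representing John as truthful and as having asserted the buying thus moves to a state in which the asserted proposition is itself marked true.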
    <Paragraph position="4"> My theory is certainly inadequate to account for many quite elementary facts about English and other natural languages. It may be that its deficiencies are too great for the theory to merit consideration.</Paragraph>
    <Paragraph position="5"> However, I would argue that it is worth according my theory the temporary immunity from falsification to which Lakatos (1970: 179) suggests new research programmes are entitled, in case anyone can suggest modifications which preserve its good points while removing its defects. Before discussing the objections to it, let me mention a number of points to which my theory offers satisfactory solutions.</Paragraph>
    <Paragraph position="6"> In the first place, the theory is attractive simply because it offers an answer (even if the answer eventually turns out to be wrong) to the question why humans should spend so much of their time exchanging the abstract structures called 'sentences': unlike cultivating the ground or building houses, the utility of this occupation is not immediately apparent to the observer (Sampson 1972a, 1975a: 133-6). In my theory, the exchange of sentences, like direct observation of the environment, helps humans build up a complex but finite 'map' or 'model' of the world, a model which can be described in quite concrete terms and which controls the human's actions in ways which, again, in principle should be quite explicitly definable.
    The notion 'model' is of course a central one in the most influential current view of what language is for -- the view which explicates natural-language semantics in terms of 'model theory'. But the 'possible worlds' of model theory, unlike the topicons of my theory, are infinitely complex entities which can hardly be taken to represent characteristics of finite human minds. Furthermore, in the model-theoretic approach to natural language, the point about a true sentence is that it denotes the Fregean truth-value True (see e.g. Suppes 1973); but if we think of the act of uttering a true sentence as the act of denoting the True, then it is quite unclear why people should utter sentences (let alone why they should utter one true sentence rather than another).29 In my theory, to utter a particular true sentence to a hearer is to make a particular change to his mental model of the world which gives the hearer more premisses from which to predict the consequences of his actions; thus, the more true sentences a person hears, the more rational his actions can be.</Paragraph>
    <Paragraph position="7"> My theory has some more specific points in its favour.</Paragraph>
    <Paragraph position="8"> It explicates neatly some syntactic/semantic distinctions which seem rather pervasive in natural language but which have resisted other attempts at explication: the definite/indefinite distinction in noun-phrases, the restrictive/appositive distinction in relative clauses. Also it neatly explains the genitive construction. Accounts of the genitive which treat it in terms of possession (e.g. Suppes 1973: 382-3) are simply unfaithful to the facts; it seems that any relation between the denotatum of the head NP and that of the genitive NP in a genitive phrase can be used to understand such a phrase, but this makes sense only if, for a given hearer, there is a well-defined, limited set of relations between denotata -- as my theory asserts. My theory shows how it can be that definite descriptions succeed in referring even though, contrary to Russell's theory of descriptions, the properties they mention are typically not uniquely instantiated -- and, more remarkably, in the case of pronouns no properties of the denotatum are specified at all. My theory is also satisfying in its treatment of presuppositions. Although the fact that sentences typically embody presuppositions has by now received much discussion in linguistics, it has not been clear how the distinction between assertions and presuppositions should be represented in terms of syntactic or semantic descriptions.
    29. For objections to model theory as a means of explicating natural-language semantics and pragmatics, cf. Sampson (1974, 1975b), Potts (1975), Jardine &amp; Jardine (1975).</Paragraph>
    <Paragraph position="9"> One proposal (cf. Fillmore 1969, Lakoff 1969, Horn 1970) is that the semantic description of a sentence should be a pair of objects, one element representing the proposition asserted and the other the proposition presupposed. This proposal is problematical, first because it seems arbitrary -- why should a semantic description of a sentence consist of a pair of propositions rather than one proposition or a 5-tuple of propositions? -- and, more seriously, because it is not clear that there is in general just one or even any fixed number of propositions presupposed by a sentence, as there is just one proposition asserted by a sentence. Thus, the sentence: (24) The car which John bought is red.</Paragraph>
    <Paragraph position="10"> presupposes that John bought a car, but also presumably that</Paragraph>
    <Paragraph position="12"> there is someone called John; John's car perhaps presupposes that John bought a car, but perhaps alludes to the fact that John was almost run down by a car, etc. etc. On my theory, failure of presupposition occurs when the input sentence is undefined for the current topicon-state. (24) will fail if there is no triple z1, z2, z3 of referents in the current state such that z1 is the referent representing John, z2 is SS(car), z3 is labelled buy, and z3 immediately dominates (z1, z2). To say that (24) presupposes that John bought a car corresponds to the fact that, if the latter phrase does not pick out any current referent, then by the rules R1-10 which define the function Int-Eng the sentence (24) will fail to create a node labelled assert -- i.e. will fail to make an assertion. Presupposition-failure is quite akin to the case in APL when a dscr node is realized as an identifier belonging to no current object, or as a function together with a set of arguments falling outside the domain of that function; in the APL case, higher asst nodes will fail to create corresponding APL-objects, as the sentence (24) fails to create either a referent labelled red or one labelled assert in a topicon lacking z1-z3.</Paragraph>
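A bottom-up evaluation of (24) against two topicon states can be sketched as follows. All names here (find_referent, interpret_24, the 'args' tuples) are my own hypothetical encoding; the point is only that when the definite description finds no referent, evaluation halts and no assert node is ever created.

```python
# Sketch of presupposition failure as an undefined lookup: the definite
# description in (24) is evaluated bottom-up against the current topicon,
# and if it picks out nothing, the higher nodes are never evaluated.

def find_referent(topicon, predicate):
    """Pick out a current referent satisfying the description, else None."""
    matches = [r for r in topicon if predicate(r)]
    return matches[0] if matches else None

def interpret_24(topicon):
    """'The car which John bought is red': new referents, or [] on failure."""
    buy = find_referent(
        topicon,
        lambda r: r['label'] == 'buy' and r['args'][0] == 'John')
    if buy is None:        # presupposition failure: processing of the
        return []          # phrase-marker stops; nothing is asserted
    car = buy['args'][1]
    red = {'label': 'red', 'args': (car,)}
    assertion = {'label': 'assert', 'args': (red,)}
    return [red, assertion]

rich = [{'label': 'buy', 'args': ('John', 'car1')}]  # hearer knows of the buying
poor = [{'label': 'buy', 'args': ('Mary', 'car2')}]  # hearer does not
ok = interpret_24(rich)
fail = interpret_24(poor)
```

The failure case mirrors the APL analogy in the text: a lower node that cannot be realized prevents the higher asst/assert nodes from creating anything.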
    <Paragraph position="13"> I have argued elsewhere (1972b) that the reason why the Liar paradox does not render English inconsistent is that, as a matter of observable fact, a definite description in a natural language is never taken by naive native speakers to refer to a proposition asserted by the sentence in which that definite description occurs, whether or not paradox would result if it were. This immediately raises the question why natural languages should have such a convenient property. My theory explains this simply: in natural languages, as in APL, interpretation of nodes takes place not simultaneously but sequentially, from the bottom upwards. At the time the referent of the NP what I am now saying is to be located, the referent to be created by the sentence What I am now saying is false cannot yet have been brought into existence, so the possibility that the two might be identical does not arise.</Paragraph>
    <Paragraph position="14"> The theory also explains the puzzling fact that an identity statement such as (25) Scott is the author of Waverley can be informative, whereas Scott is Scott is not. We may assume that a rule of inference of English states that anything is identical to itself, i.e. that a referent labelled identical may always be created dominating (rx, rx) for any referent rx. Therefore the input sentence Scott is Scott achieves nothing that the successor-state relation could not have achieved independently of any input. In the case of (25), however, if the hearer does not know that Scott is the author of 'Waverley', then the two NPs will pick out different referents r1, r2 in his topicon and will create an identical node dominating (r1, r2); clearly no English rule of inference will do this.</Paragraph>
    <Paragraph position="16"> The composite nature of the denotation relation incorporated within my theory copes neatly with the fact that natural languages use exactly the same syntactic devices for discussing characters in fiction, and the like, as for discussing real entities. Anyone who has read Crime and Punishment will understand the sentences (26) and (27), and will agree that the former is true and the latter false.
    30. Strictly speaking, (25) will create an assert node dominating the referent representing the utterer and the identical node mentioned. We may assume that one of the rules of inference of English lays down that when two distinct referents r and r' are identified, a new state may be formed in which r and r' are replaced by a single referent connected with all the referents to which either r or r' were linked.</Paragraph>
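The merge rule suggested in footnote 30 can be sketched directly: identifying two referents yields a state in which a single referent carries the union of their links, and links from other referents are redirected. The adjacency-map encoding and the name merge are assumptions of mine.

```python
# Sketch of footnote 30's merge rule: referent b is absorbed into referent
# a, which inherits every link either one had; links pointing at b are
# redirected to a. Encoding (referent ids mapped to link sets) is assumed.

def merge(links, a, b):
    """Merge referent b into a, returning the new link map."""
    merged = {}
    for ref, targets in links.items():
        if ref == b:
            continue  # b disappears as a separate referent
        merged[ref] = {a if t == b else t for t in targets}
    # a inherits b's links (with any self-links redirected to a)
    merged[a] = merged.get(a, set()) | {a if t == b else t
                                        for t in links.get(b, set())}
    return merged

# Hearer learns 'Scott is the author of Waverley': r_scott and r_author
# were distinct referents, each linked to different facts.
links = {
    'r_scott': {'f_person'},
    'r_author': {'f_wrote_waverley'},
    'r_hume': {'r_author'},  # a third referent linked to r_author
}
links2 = merge(links, 'r_scott', 'r_author')
```

After the merge, everything previously known about either referent is known about the single surviving one, which is why (25) is informative while Scott is Scott changes nothing.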
    <Paragraph position="17"> Yet, in the case of formulae of the predicate calculus such as f(a, b), f(b, a), if a or b lack denotation then the formulae</Paragraph>
    <Paragraph position="19"> as wholes seem to be either both false or both meaningless, but not interestingly different.</Paragraph>
    <Paragraph position="20"> Reichenbach (1947: 549) has offered a logic which includes representations of sentences about fictional entities, but in his system the symbol-arrays corresponding to NPs having fictional referents are quite different in kind from those corresponding to NPs having real referents. There is no trace of such a distinction in the syntax of natural languages. In my theory, the NPs Raskolnikov and Richard Nixon work in exactly the same way as each other -- they each pick out one of the referents in the hearer's topicon -- so it is natural that the NPs are syntactically parallel. The fact that the referent of Raskolnikov will have the property SS(fictional) while that of Richard Nixon has the property</Paragraph>
    <Paragraph position="22"> (28) from sentences about Richard Nixon and Spiro Agnew syntactically than is the fact that the referent of Raskolnikov has the property SS(Russian) while that of Nixon has the property SS(American).
    31. The topicon theory thus seems to make some sense of the ontological views of Meinong (1913) and the early Russell (cf. Linsky 1967: 2). Meinong was troubled by the truth of e.g. (i) Pegasus does not exist.</Paragraph>
    <Paragraph position="23">  since, if Pegasus really does not exist, there appears to be nothing which (i) can be about, and thus (i) cannot make a true statement. Meinong therefore suggested (382-3, 491) that, although it was true of only some definite descriptions that their denotata actually existed, the denotatum of any definite description had quasi-existence, and this was enough for an entity to serve as the subject of a statement. In our terms, to 'denote a quasi-existent object' is to refer to a referent; to 'denote an existent object' is to refer to a referent having a denotatum.</Paragraph>
    <Paragraph position="24"> Finally, my theory suggests why there are three categories of Austinian 'speech acts'. Austin (1962) distinguished (not consistently, admittedly) between locutionary acts (speaking), illocutionary acts (doing something, e.g. giving an order, in speaking), and perlocutionary acts (achieving some effect, e.g. causing the hearer to perform an action, through speaking). (A number of current commentators on Austin would not agree with my presentation of his distinctions; however, I believe my discussion is faithful to Austin's own views in much of How to Do Things with Words.) Why should there be just three</Paragraph>
    <Paragraph position="26"> categories of speech act, rather than two or four? Some scholars have suggested that the three-category analysis is incorrect; but I would support it. Consider the various consequences of speaking. At the first level, sound is produced; the production of this sound is a locutionary act. If the sound is a well-formed sentence of English which is defined by the input function for the hearer's topicon state, then that sentence produces a specific effect on the hearer's topicon: the production of this effect is an illocutionary act of the type defined by the label of the topmost new referent. Thus, if the sentence adds to the topicon a referent labelled assert, the illocutionary act is one of assertion; the 'misfiring' of an illocutionary act, as when a sentence syntactically in declarative form fails to make an assertion because one of its definite descriptions fails to refer, corresponds to failure to create an assert referent in the hearer's topicon. (We have seen that, when a subordinate node cannot be evaluated, processing of the phrase-marker stops.) The new topicon state may lead to other topicon states and, perhaps, to actions on the hearer's part, via the successor-state relation: the production of such effects may be identified with Austin's perlocutionary acts.</Paragraph>
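The three levels just described can be sketched as stages in a toy pipeline. The type names and the particular effect lists are invented for illustration; the finite ILLOCUTION_TYPES set merely stands in for the possible labels of sentential phrase-marker roots.

```python
# Sketch of the three Austinian levels as stages of topicon processing:
# locution = the sound/sentence; illocution = the label of the topmost
# referent it creates (or a 'misfire'); perlocutions = the many successor
# states that may then follow. All names are assumed for illustration.

ILLOCUTION_TYPES = {'assert', 'command', 'question'}  # a finite class

def illocutionary_force(root_label):
    """A well-defined force drawn from a finite class, or a misfire."""
    return root_label if root_label in ILLOCUTION_TYPES else 'misfire'

def perlocutionary_effects(force):
    """Non-deterministic: several successor states may follow one act."""
    if force == 'command':
        return ['hearer complies', 'hearer refuses']
    if force == 'assert':
        return ['hearer believes', 'hearer disbelieves']
    return []  # a misfire creates no topmost referent, hence no effects

force = illocutionary_force('command')   # e.g. from 'Shut the door!'
effects = perlocutionary_effects(force)
```

The asymmetry in the sketch (one force per sentence, several possible effects) is the point carried forward in the next paragraph.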
    <Paragraph position="27"> My theory predicts that the illocutionary force of a given sentence should be well-defined and drawn from a finite class of illocutionary types (corresponding to the possible labels of sentential phrase-marker roots), while (since the successor-state relation is non-deterministic) the perlocutionary effects may be many and various; this seems to accord with Austin's discussion.  22.</Paragraph>
    <Paragraph position="28"> Having presented my theory and discussed the respects in which it seems successful, I must now discuss its many inadequacies. Some aspects of English have been omitted from the present account simply for the sake of brevity; I believe there is no difficulty of principle in expanding my account to handle e.g. plurality, co-ordination with and, prepositions, adverbs, modality, and most subordinate clauses. But a number of English constructions present greater problems. These include, for instance, negation and universal quantification.33 For negation, one might think of treating not as a monadic predicate whose argument is a proposition, so that, e.g., John did not buy a car would have the same effect on a hearer's topicon as John bought a car, followed by the creation of a
    32. In the framework of my theory, the locutionary/illocutionary distinction becomes rather parallel to the distinction between seeing and seeing as which exercised Wittgenstein and other philosophers. Take the duck-rabbit picture (cf. Hanson 1958): I see it if light reflected from that picture stimulates my optic nerve; I see it as a duck if this stimulation leads to the creation of a SS(duck) referent in my topicon.
    33. The fact that these two constructions should both be problematic is no coincidence. We can handle sentences whose translations into predicate calculus involve existential quantification, e.g. '(∃x)(c(x) &amp; b(j, x))' for John bought a car; '(∀x)' is interchangeable with '~(∃x)~', so, if we could handle negation, we should be able to handle universal quantification.</Paragraph>
    <Paragraph position="29"> 33The fact that these two constructions should both be problematic is no coincidence. We can handle sentences whose translations into predicate calculus involve existential'quantification, e.g. '&amp;r)(c(x) &amp; b(;i, 5))' for J bat a car; ' -3x-@ is interchangeiibre wiFh x , 80, i&amp; could hgnG negarion, we should be able to ha'dle universal quantification. not referent dominating the bx referent just created. But F-f&amp;quot; then it would make sense to speak of the sr which John didn't hML 7-bx , n whereas in its commoner sense John did not bs a car does --- -not imply the existence of any particular unbought car. Again, one might think of interpreting e.g. All love John as 0uu-- -creating a love referent dominating each pair (r r ) of refer- -1 ' -2 ents in the hearer's topicon such that r is (r) and r</Paragraph>
    <Paragraph position="31"> is the referent representing John. But this would be quite inadequate: the sentence is about, not the particular girls the hearer knows of when he hears it uttered, but all girls whatsoever. A related point is that the theory does not handle the generic sense of definite NPs, as in The elephant is a ... . Other difficulties are with yes/no questions, with the two interpretations of complement clauses, with truth-functional connectives such as if, or, with conjunctions (such as but v. and, or ... v. because) whose appropriateness depends on a given proposition constituting evidence for or against the truth of another, and with comparative and superlative constructions.35
    34. One approach to these problems might involve introducing referents representing 'universals' (in the logical sense), so that for an individual referent to have the property SS(girl) or SS(elephant) would be for that referent to be one of an unordered set of individual referents dominated by the referent representing the universal girl or elephant (in which case the notion of 'labelling' propositional referents might be dropped). Then the propositional referent created by All girls love John would link the referent representing the universal girl with the individual referent for John. It remains to be seen whether an adequate solution can be produced along these lines.</Paragraph>
    <Paragraph position="32"> Another deficiency of the present theory is that there are phrase-markers generated by (17) whose effects are not specified by R1-10, e.g. phrase-markers in which NP is rewritten wh S. I hope that an account of the unexplained constructions in the above list may turn out to involve uses for the phrase-markers which are not handled by R1-10, but I have no idea whether this will be so.
    23. I am not at present clear how to adapt my theory to account for these constructions, and, since they include some very basic ones, my discussion of the nature of the automaton underlying a hearer's linguistic abilities may be worthless. However, although my theory may be rejected, it would seem that there must be some adequate theory of the human comprehension of language in terms of automata whose states are of finite complexity. An account of natural-language semantics in terms of infinitely large sets of, in general, infinitely complex possible worlds cannot be the whole truth about how finite human beings understand language. I hope, therefore, that the inadequacies of the above account may spur others to improve on my work.</Paragraph>
    <Paragraph position="33"> 35. I am not sure whether the opaque/transparent reference distinction belongs on this list. I am inclined to explain the two senses of e.g. John is looking at the dean by saying that the hearer's topicon will contain referents representing referents in John's topicon (as well as referents representing objects in the real world), and that while, in the transparent sense, the dean picks out one of the ordinary referents, in the opaque sense it picks out one of the referents representing John's referents. But clearly this needs to be spelled out more fully than I have done. A D.Phil. thesis currently being prepared by Ephraim Borowski of Hertford College, Oxford, incorporates some promising lines of attack on a number of these problems.</Paragraph>
  </Section>
class="xml-element"></Paper>